maximhq / bifrost
Fastest enterprise AI gateway (50x faster than LiteLLM) with an adaptive load balancer, cluster mode, guardrails, support for 1,000+ models, and <100 µs overhead at 5k RPS.
4,892 stars · 586 forks · last commit today · Go · Apache-2.0 license
Install
go install github.com/.../bifrost@latest
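The install command above fetches the gateway binary. Clients then talk to the running gateway with an OpenAI-style chat-completion payload. The sketch below builds that JSON body in Go; the endpoint path and field names are assumptions based on the common OpenAI-compatible convention, not taken from Bifrost's documentation.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// chatMessage and chatRequest mirror the OpenAI-style chat-completion
// payload that OpenAI-compatible gateways typically accept. The exact
// schema Bifrost expects is an assumption here.
type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

// buildChatRequest serializes a single-message request for the given model.
func buildChatRequest(model, prompt string) string {
	req := chatRequest{
		Model:    model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
	}
	b, _ := json.Marshal(req)
	return string(b)
}

func main() {
	// POST this body to the gateway's chat-completions route,
	// e.g. http://localhost:8080/v1/chat/completions (path is an assumption).
	fmt.Println(buildChatRequest("gpt-4o", "Hello from Bifrost"))
}
```

Because the gateway is OpenAI-compatible, the same body works regardless of which upstream provider the load balancer routes the call to.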
More AI servers
- Brave Search: web and local-business search via the Brave Search API; privacy-first, no Google account required.
- Memory: knowledge-graph memory that stores entities, observations, and relations across conversations.
- Sequential Thinking: structured chain-of-thought scaffolding in which the model writes its own multi-step plan as MCP calls.