LLM Gateway is an open-source middleware that consolidates interactions with multiple LLM providers (such as OpenAI, Anthropic, and Google Vertex AI) behind a single, unified API compatible with OpenAI's spec. Designed for both self-hosted and cloud use, it lets developers route requests dynamically, secure and manage API keys, monitor token usage and costs, and analyze performance metrics. With an optional UI, telemetry, and Docker deployment, it suits teams that want to centralize LLM orchestration and gain visibility into AI usage.
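Because the gateway speaks OpenAI's API, existing clients only need to change their endpoint. The sketch below builds a standard chat-completion request body; the host, port, path, and model name are assumptions for illustration, not values taken from the project's docs.

```python
import json

# Sketch of the request shape an OpenAI-compatible gateway accepts.
# The endpoint and model name below are assumptions -- substitute your
# own deployment's address and a model your providers expose.
payload = {
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello, gateway!"}],
}

# The same JSON body works against api.openai.com or the gateway, e.g.
# POST http://localhost:8080/v1/chat/completions with an
# Authorization: Bearer <gateway-issued key> header.
body = json.dumps(payload)
print(body)
```

In practice this means SDKs that accept a configurable base URL can point at the gateway without any other code changes.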
Features
- Unified OpenAI-style API for multi-provider support
- Dynamic routing of requests to selected LLM services
- Centralized API key management and credential storage
- Detailed analytics on usage, tokens, latency, and costs
- Performance comparison across models and providers
- Easy deployment via Docker (single or compose setup)
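For the Docker bullet, a compose setup might look like the minimal sketch below. The image name, port, and environment variables are assumptions; consult the project's own deployment docs for the real values.

```yaml
# Hypothetical compose sketch -- image name, port, and variables are assumptions.
services:
  llm-gateway:
    image: llmgateway/llm-gateway:latest   # assumed image name
    ports:
      - "8080:8080"                        # assumed gateway port
    environment:
      OPENAI_API_KEY: ${OPENAI_API_KEY}        # provider keys held centrally
      ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY}  # by the gateway, not clients
```

Centralizing provider keys in the gateway's environment is what lets clients authenticate with gateway-issued keys instead of raw provider credentials.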
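The dynamic-routing feature above can be sketched as a simple lookup from model-name prefix to provider. The prefixes and provider names here are illustrative assumptions, not the project's actual routing table.

```python
# Sketch of prefix-based routing as a gateway might perform internally.
# The routing table below is a hypothetical example.
ROUTES = {
    "gpt-": "openai",
    "claude-": "anthropic",
    "gemini-": "google-vertex",
}

def route(model: str) -> str:
    """Return the provider that should receive a request for `model`."""
    for prefix, provider in ROUTES.items():
        if model.startswith(prefix):
            return provider
    raise ValueError(f"no provider registered for model {model!r}")

print(route("claude-3-5-sonnet"))
```

A real gateway would layer credentials, retries, and usage accounting on top of this dispatch step, but the core decision is this kind of model-to-provider mapping.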
License
MIT License