ID-based RAG FastAPI: integration with LangChain and PostgreSQL
OpenAI-style API for open large language models
Connect MATLAB to LLM APIs, including OpenAI® Chat Completions
An unofficial Python package that returns responses from Google Bard
Run Local LLMs on Any Device. Open-source
Operating LLMs in production
Gorilla: An API store for LLMs
A list of free LLM inference resources accessible via API
Distribute and run LLMs with a single file
Universal MCP server for your databases, optimized for LLMs
Optimizing inference proxy for LLMs
Self-hosted, community-driven, local OpenAI compatible API
Composio equips your AI agents & LLMs
Bench is a tool for evaluating LLMs for production use cases
Interact with your documents using the power of GPT
Open source alternative to ChatGPT that runs 100% offline
Model Context Protocol server for GraphQL
A Tree Search Library with Flexible API for LLM Inference-Time Scaling
Private Open AI on Kubernetes
Semantic cache for LLMs. Fully integrated with LangChain
The only fully local production-grade Super SDK
Run local LLMs like Llama, DeepSeek, Kokoro, etc. inside your browser
The easiest and laziest way to build multi-agent LLM applications
Ray Aviary - evaluate multiple LLMs easily
Speech-AI-Forge is a project developed around TTS generation models