Composio equips your AI agents & LLMs
The Memory layer for AI Agents
BISHENG is an open LLM DevOps platform for next-generation apps
Central interface to connect your LLMs with external data
A Model Context Protocol (MCP) server implementation
Query MCP enables end-to-end management of Supabase via chat interface
Connect any Open Data to any LLM with Model Context Protocol
K8s-mcp-server is a Model Context Protocol (MCP) server
Evaluation and Tracking for LLM Experiments
CodeGeeX2: A More Powerful Multilingual Code Generation Model
CLIP: predict the most relevant text snippet given an image
Supercharge Your LLM Application Evaluations
A Tree Search Library with Flexible API for LLM Inference-Time Scaling
Semantic cache for LLMs. Fully integrated with LangChain
Tongyi Deep Research, the Leading Open-source Deep Research Agent
Helping you get the most out of AWS, wherever you use MCP
Build resilient language agents as graphs
Inference framework for 1-bit LLMs
Optimizing inference proxy for LLMs
Go ahead and axolotl questions
Leveraging BERT and c-TF-IDF to create easily interpretable topics
Pruna is a model optimization framework built for developers
Fault-tolerant, highly scalable GPU orchestration
HexStrike AI MCP Agents is an advanced MCP server