A scalable inference server for models optimized with OpenVINO
Last9 MCP Server
csghub-server is the backend server for CSGHub
WhatsApp MCP server enabling AI access to chats and messaging
The easiest and laziest way to build multi-agent LLM applications
Personal AI, On Personal Devices
Lemonade helps users run local LLMs with the highest performance
Official MiniMax Model Context Protocol (MCP) server
Specification for multi-provider, interoperable LLM interfaces
AI search engine - self-host with local or cloud LLMs
Perplexica is an AI-powered answering engine.
DeepSeek 4 Flash local inference engine for Metal
AI assistant that supports knowledge bases and model APIs
Full-stack Open-source Self-Evolving General AI Agent
Mac app for Ollama
An MCP server that autonomously evaluates web applications
Python-free Rust inference server
Kubernetes Controller for building, testing and deploying MCP servers
Open source alternative to ChatGPT that runs 100% offline
An MCP server that allows LLMs to gain context about shadcn/ui components
A simple, secure MCP-to-OpenAPI proxy server
Fast, local-first web content extraction for LLMs
Dive is an open-source MCP host desktop application
Local Groq Desktop chat app with MCP support
AI chat for any model