Run LLMs locally on Cloud Workstations
Model Context Protocol Server for Apache OpenDAL™
Lemonade helps users run local LLMs with high performance
Personal AI, On Personal Devices
WhatsApp MCP server enabling AI access to chats and messaging
A Model Context Protocol (MCP) server implementation for DuckDB
An official Qdrant Model Context Protocol (MCP) server implementation
Query MCP enables end-to-end management of Supabase via a chat interface
A Model Context Protocol (MCP) server implementation
An MCP (Model Context Protocol) server implementation
A Model Context Protocol (MCP) server
An MCP server for interacting with Google Colab
An MCP server that autonomously evaluates web applications
A Model Context Protocol server for searching and analyzing arXiv
A Model Context Protocol Server for Home Assistant
ChatGLM3 series: open-source bilingual chat LLMs
Supercharge Your LLM with the Fastest KV Cache Layer
The easiest and laziest way to build multi-agent LLM applications
Run all your local AI together in one package
Terminal-based LLM chat tool with multi-model and local support
Explainability and interpretability for developing reliable ML models
Elyra extends JupyterLab with an AI-centric approach
A high-quality rapid TTS voice cloning model
Private chat with a local GPT for documents, images, video, etc.
Dockerized FastAPI wrapper for Kokoro-82M text-to-speech model