The most powerful MCP Slack Server with no permission requirements
Your Personal Research Multi-Tool
csghub-server is the backend server for CSGHub
Full stack framework for building cross-platform mobile AI apps
CLI proxy that reduces LLM token consumption
Control Gmail, Google Calendar, Docs, Sheets, Slides, Chat, Forms
Masks sensitive data and secrets before they reach AI
An efficient forwarding service designed for LLMs
Manages Unified Access to Generative AI Services
Distributed LLM and StableDiffusion inference
LLM Frontend for Power Users
Plugin for JADX to integrate MCP server
Drag & drop UI to build your customized LLM flow
WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
Completely free, private, UI based Tech Documentation MCP server
Fast, flexible LLM inference
Flutter-based cross-platform app integrating major AI models
Open-Source Alternative to Context7, Nia, and Ref.Tools
ChatGLM3 series: open-source bilingual chat LLMs
AI assistant that supports knowledge bases, model APIs
Fully private LLM chatbot that runs entirely in the browser
A quick illustration of how to read books together with LLMs
Bringing large-language models and chat to web browsers
Query anything (GitHub, Notion, +40 more) with SQL and let LLMs
Run AI models locally on your machine with node.js bindings for llama