MCP server for interacting with Manifest V2-compatible browsers
Plugin for JADX to integrate MCP server
Fast stable diffusion on CPU and AI PC
Run local LLMs like Llama, DeepSeek, Kokoro, etc. inside your browser
High-performance browser automation bridge and orchestrator
Dockerized FastAPI wrapper for Kokoro-82M text-to-speech model
InvokeAI is a leading creative engine for Stable Diffusion models
Socket-based MCP server for Ghidra
Browser action engine for AI agents. 10× faster, resilient by design
Deep Research framework combining language models with tools
Model Context Protocol server that integrates AgentQL's data
Mobile and Web client for Codex and Claude Code, with realtime voice
A clean web dashboard for OpenClaw
Deploy OpenClaw with one click
Talk with Azure using MCP
WebAssembly binding for llama.cpp, enabling on-browser LLM inference
Python Telegram Bot API
ChatGLM3 series: open-source bilingual chat LLMs
Java enterprise application development framework
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs
Self-contained, offline survival computer with tools, knowledge, & AI
The open source coding agent
Google Flights MCP and Python Library
A TypeScript SSE proxy for MCP servers that use stdio transport
Helps developers deploy LangChain runnables and chains as a REST API