TensorRT-LLM provides users with an easy-to-use Python API
Turn any GitHub repository into an MCP documentation server for AI
ChatGLM3 series: Open Bilingual Chat LLMs
Streamline your ML workflow
An MCP server for interacting with Google Colab
Kimi Code CLI is your next CLI agent
Specification for multi-provider, interoperable LLM interfaces
Open-source MCP server that gives your coding agent
TensorFlow is an open source library for machine learning
Fully private LLM chatbot that runs entirely in the browser
A Model Context Protocol (MCP) Gateway & Registry
A Discord music bot that's easy to set up and run yourself
SDK for building interactive UI components over MCP for AI tools
OpenAI-style API for open large language models
A lightweight, lightning-fast, in-process vector database
Bringing large-language models and chat to web browsers
Browser action engine for AI agents. 10× faster, resilient by design
Elyra extends JupyterLab with an AI-centric approach
Rust async runtime based on io_uring
Fast ML inference & training for ONNX models in Rust
GPU-accelerated decision optimization
Supercharge Your LLM with the Fastest KV Cache Layer
Interactively analyze ML models to understand their behavior
Agent framework and applications built upon Qwen>=3.0
Fast and efficient unstructured data extraction