Supercharge Your LLM with the Fastest KV Cache Layer
Browser MCP is a Model Context Protocol (MCP) server
Self-contained, offline survival computer with tools, knowledge, & AI
TensorRT-LLM provides users with an easy-to-use Python API
Take control of your AI agents
Browser action engine for AI agents. 10× faster, resilient by design
An open-source, modern-design AI training tracking and visualization tool
MCP server for Kubernetes management and workload status analysis
The open source post-building layer for agents
Completely free, private, UI based Tech Documentation MCP server
Python-free Rust inference server
ChatGLM3 series: open-source bilingual chat LLMs
Large Language Model Text Generation Inference
Framework and no-code GUI for fine-tuning LLMs
OCR model for complex documents with layout-aware structured outputs
A lightweight, lightning-fast, in-process vector database
Browser automation for AI agents and humans
Specification for multi-provider, interoperable LLM interfaces
A Discord music bot that's easy to set up and run yourself
The AI-Native Search Database
Enterprise AI agent platform for workflows, models, and RAG apps
Open source alternative to ChatGPT that runs 100% offline
Context data platform for building observable, self-learning AI agents
Rust async runtime based on io-uring
GPU accelerated decision optimization