LangChain for Rust, the easiest way to write LLM-based programs
The Rust workspace under rust/ is the current systems-language port
Rust-native, ready-to-use NLP pipelines and transformer-based models
Fast ML inference & training for ONNX models in Rust
A minimal, secure Python interpreter written in Rust for use by AI
Semantic search and document parsing tools for the command line
Graph-vector database for building unified AI backends fast
Fast and efficient unstructured data extraction
Rust framework for building modular and scalable LLM-powered apps
Serialize repositories into LLM-ready context with smart prioritization
Instant, controllable, local pre-trained AI models in Rust
An improved implementation of the Ralph Wiggum technique
Python-free Rust inference server
Your favorite Terminal Coding Agent, now in Rust
Convert codebases into structured prompts optimized for LLM analysis
157 models, 30 providers, one command to find what runs on your hardware
High-performance API combining reasoning and creative AI models
Delivery infrastructure for agentic apps
The AI framework that adds the engineering to prompt engineering
Local AI coding agent CLI with multi-agent orchestration tools
Local, policy-gated signing and wallet management for every chain
Framework to prove inference of ML models blazingly fast
Visual AI IDE for building agents with prompt chains and graphs
High-performance inference server and API layer for text embedding models
Open source AI wearable platform for recording and summarizing speech