MLOps simplified. From ML Pipeline ⇨ Data Product without the hassle
Deep universal probabilistic programming with Python and PyTorch
A Model Context Protocol (MCP) server
Documentation for the Krixik Python client
Official inference repo for FLUX.2 models
Code for running inference with the SAM 3D Body model (3DB)
A Unified Library for Parameter-Efficient Learning
Industrial-strength Natural Language Processing (NLP)
Efficient Triton Kernels for LLM Training
Context-aware AI Sales Agent to automate sales outreach
3D reconstruction software
An Autonomous LLM Agent for Complex Task Solving
The most reliable AI agent framework that supports MCP
RAGFlow is an open-source RAG (Retrieval-Augmented Generation) engine
Python framework for AI workflows and pipelines with chain of thought
State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX
The Triton Inference Server provides an optimized cloud and edge inferencing solution
An Efficient and Easy-to-use Federated Learning Framework
A Model Context Protocol Server for Home Assistant
An MCP server that provides fast file searching capabilities
Provides line-oriented text file editing capabilities
Airtable integration for AI-powered applications
A Model Context Protocol (MCP) server implementation
A Model Context Protocol (MCP) server implementation
Query MCP enables end-to-end management of Supabase via a chat interface