An official Qdrant Model Context Protocol (MCP) server implementation
Browse the web directly from Cursor and other tools
Shell command execution server implementing the Model Context Protocol
Develop software autonomously
Optimizing inference proxy for LLMs
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
A Modular Simulation Framework and Benchmark for Robot Learning
TextWorld is a sandbox learning environment for the training and evaluation of reinforcement learning agents on text-based games
This repository contains notebooks for the Advanced Solutions Lab
Open-source repository for the Pokee Deep Research model
Revolutionizes the way users interact with AutoGen
Easy-to-use and powerful NLP library with an awesome model zoo
State-of-the-art diffusion models for image and audio generation
A lightweight vision library for performing large-scale object detection
Tool for visualizing and tracking your machine learning experiments
A Unified Library for Parameter-Efficient Learning
Scripts for fine-tuning Meta Llama3 with composable FSDP & PEFT methods
A Powerful Native Multimodal Model for Image Generation
Python framework for AI workflows and pipelines with chain-of-thought reasoning
PyTorch extensions for fast R&D prototyping and Kaggle farming
The Triton Inference Server provides an optimized cloud and edge inferencing solution
Unified Model Serving Framework
A Python library for audio processing and analysis
An open-source, low-code machine learning library in Python
A high-performance ML model serving framework that offers dynamic batching