Browse free, open source Python LLM Inference Tools and projects below.
Run local LLMs on any device; open source
Ready-to-use OCR with 80+ supported languages
Lightweight anchor-free object detection model
A high-throughput and memory-efficient inference and serving engine for LLMs (see the usage sketch after this list)
Library for OCR-related tasks powered by Deep Learning
A set of Docker images for training and serving models in TensorFlow
Implementation of model parallel autoregressive transformers on GPUs
Data manipulation and transformation for audio signal processing
The official Python client for the Huggingface Hub (see the download sketch after this list)
Uncover insights, surface problems, monitor, and fine-tune your LLM
State-of-the-art diffusion models for image and audio generation
A lightweight vision library for performing large-scale object detection
Replace OpenAI GPT with another LLM in your app
Official inference library for Mistral models
Optimizing inference proxy for LLMs
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
Deep learning optimization library: makes distributed training easy
Open platform for training, serving, and evaluating language models
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
Easiest and laziest way to build multi-agent LLM applications
OpenMMLab Video Perception Toolbox
Bring the notion of Model-as-a-Service to life
Lightweight Python library for adding real-time multi-object tracking to any detector
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Operating LLMs in production
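To give a feel for how the inference engines in this list are typically driven from Python, here is a minimal offline-inference sketch using vLLM, the high-throughput serving engine listed above. The model name and sampling settings are arbitrary placeholders, and the exact API may vary between vLLM releases.

```python
# Minimal offline batch inference with vLLM (sketch; model name is a placeholder).
from vllm import LLM, SamplingParams

prompts = [
    "The capital of France is",
    "An LLM inference engine is responsible for",
]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# Load the model weights and set up the engine (vLLM manages the KV cache internally).
llm = LLM(model="facebook/opt-125m")

# Generate completions for all prompts in one batched call.
outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.prompt, "->", output.outputs[0].text)
```

The same pattern scales from a quick local test to batch jobs; for online serving, vLLM also ships an HTTP server, which is closer to how the other serving toolkits in this list (LMDeploy, OpenLLM, FastChat) are used.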
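Most of these engines load weights from the Hugging Face Hub, so a short sketch of the huggingface_hub client listed above is also useful. The repo id below is only an example; any model repository works the same way.

```python
# Download model files from the Hugging Face Hub (sketch; repo id is an example).
from huggingface_hub import hf_hub_download, snapshot_download

# Fetch a single file, e.g. the model config ...
config_path = hf_hub_download(repo_id="facebook/opt-125m", filename="config.json")

# ... or mirror the whole repository locally for an inference engine to load.
local_dir = snapshot_download(repo_id="facebook/opt-125m")
print(config_path, local_dir)
```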