Browse free open source Python LLM Inference Tools and projects below.
Open-source tool for running local LLMs on any device
A high-throughput and memory-efficient inference and serving engine
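This description matches vLLM's tagline; as a hedged illustration only, assuming the vllm package and its offline LLM/SamplingParams API, a minimal batched-generation call might look like the sketch below (the model name is just an example):

```python
from vllm import LLM, SamplingParams

# Load a small model for offline batched generation (model name is illustrative).
llm = LLM(model="facebook/opt-125m")
params = SamplingParams(temperature=0.8, max_tokens=64)

# generate() accepts a batch of prompts and returns one output object per prompt.
outputs = llm.generate(["The main bottleneck in LLM serving is"], params)
print(outputs[0].outputs[0].text)
```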
Ready-to-use OCR with 80+ supported languages
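This entry matches EasyOCR; a minimal sketch, assuming the easyocr package, where a Reader is built for one or more language codes and readtext returns (box, text, confidence) tuples (the image path is a placeholder):

```python
import easyocr

# Build a reader for English; additional language codes can be added to the list.
reader = easyocr.Reader(["en"])

# readtext returns a list of (bounding_box, text, confidence) tuples.
for box, text, confidence in reader.readtext("receipt.png"):
    print(f"{confidence:.2f}  {text}")
```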
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
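A minimal sketch of LMDeploy's pipeline API as commonly used (the model name is only an example): the pipeline loads a model in-process and runs batched generation.

```python
from lmdeploy import pipeline

# Load a chat model in-process (model name is illustrative).
pipe = pipeline("internlm/internlm2-chat-7b")

# The pipeline accepts a batch of prompts and returns one response object per prompt.
responses = pipe(["Explain what it means to serve an LLM."])
print(responses[0].text)
```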
Uncover insights, surface problems, monitor, and fine-tune your LLM
Deep learning optimization library that makes distributed training easy
Lightweight anchor-free object detection model
Library for OCR-related tasks powered by Deep Learning
Replace OpenAI GPT with another LLM in your app
Low-latency REST API for serving text embeddings
LLM training code for MosaicML foundation models
Integrate, train, and manage any AI model and API with your database
Database system for building simpler and faster AI-powered applications
GPU environment management and cluster orchestration
Operating LLMs in production
Everything you need to build state-of-the-art foundation models
An easy-to-use LLM quantization package with user-friendly APIs
Sparsity-aware deep learning inference runtime for CPUs
A Pythonic framework to simplify AI service building
Bring the notion of Model-as-a-Service to life
Create HTML profiling reports from pandas DataFrame objects
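This matches pandas-profiling (now distributed as ydata-profiling); a minimal sketch, assuming the ydata_profiling package, where ProfileReport renders an HTML report for a DataFrame (the CSV path is a placeholder):

```python
import pandas as pd
from ydata_profiling import ProfileReport

# Profile a DataFrame and write the interactive HTML report to disk.
df = pd.read_csv("data.csv")
ProfileReport(df, title="Data overview").to_file("report.html")
```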
Phi-3.5 for Mac: Locally-run Vision and Language Models
An MLOps framework to package, deploy, monitor, and manage models
Superduper: Integrate AI models and machine learning workflows
Large Language Model Text Generation Inference
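This is the tagline of Hugging Face's text-generation-inference server; a hedged sketch of calling its REST endpoint, assuming a server is already running locally on port 8080:

```python
import requests

# Assumes a text-generation-inference server is already listening on localhost:8080.
response = requests.post(
    "http://localhost:8080/generate",
    json={"inputs": "What does an inference server do?",
          "parameters": {"max_new_tokens": 64}},
    timeout=60,
)
print(response.json()["generated_text"])
```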