Browse free and open source LLM inference tools and projects for BSD below. Projects can be filtered by OS, license, language, programming language, and project status.
Port of OpenAI's Whisper model in C/C++
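This is whisper.cpp's tagline. As a rough sketch of how the tool is typically driven, the Python snippet below shells out to a locally built whisper.cpp binary; the binary location, model path, and sample file are assumptions, and older builds name the executable "main" rather than "whisper-cli".

    import subprocess

    # Assumed locations of a locally built whisper.cpp binary and a downloaded ggml model.
    WHISPER_BIN = "./build/bin/whisper-cli"   # older whisper.cpp builds call this binary "main"
    MODEL_PATH = "models/ggml-base.en.bin"

    def transcribe(wav_path: str) -> str:
        """Transcribe a 16 kHz WAV file with whisper.cpp and return the plain-text output."""
        result = subprocess.run(
            [WHISPER_BIN, "-m", MODEL_PATH, "-f", wav_path, "--no-timestamps"],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    if __name__ == "__main__":
        print(transcribe("samples/jfk.wav"))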
Port of Facebook's LLaMA model in C/C++
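This entry is llama.cpp, which also ships an HTTP server exposing an OpenAI-compatible chat endpoint. The sketch below assumes a llama-server instance is already running on localhost with some GGUF model loaded; the port, prompt, and sampling values are illustrative.

    import requests

    # Assumes llama.cpp's server is running locally, e.g.:
    #   llama-server -m ./models/model.gguf --port 8080
    URL = "http://127.0.0.1:8080/v1/chat/completions"

    payload = {
        "messages": [
            {"role": "system", "content": "You are a concise assistant."},
            {"role": "user", "content": "Explain in one sentence what a GGUF file is."},
        ],
        "temperature": 0.2,
        "max_tokens": 128,
    }

    response = requests.post(URL, json=payload, timeout=120)
    response.raise_for_status()
    print(response.json()["choices"][0]["message"]["content"])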
Run local LLMs on any device; open source
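This description matches GPT4All's tagline. Assuming that project is meant, its Python bindings can be used roughly as follows; the model name is illustrative and is downloaded on first use.

    from gpt4all import GPT4All  # pip install gpt4all

    # Illustrative model name; GPT4All fetches it on first use if it is not already cached.
    model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

    with model.chat_session():
        reply = model.generate("List two advantages of running an LLM locally.", max_tokens=200)
        print(reply)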
User-friendly AI Interface
Run local LLMs such as Llama, DeepSeek, and Kokoro inside your browser
Implementation of model-parallel autoregressive transformers on GPUs
Deep learning optimization library that makes distributed training easy
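This is DeepSpeed's tagline. A minimal training-step sketch follows, assuming PyTorch and DeepSpeed are installed and the script is started with the deepspeed launcher (for example, deepspeed train.py); the toy model and config values are illustrative.

    import torch
    import deepspeed

    # Toy model and a deliberately small, illustrative DeepSpeed config.
    model = torch.nn.Sequential(torch.nn.Linear(32, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1))
    ds_config = {
        "train_batch_size": 8,
        "optimizer": {"type": "Adam", "params": {"lr": 1e-3}},
    }

    # deepspeed.initialize wraps the model into a distributed engine with its own optimizer.
    engine, optimizer, _, _ = deepspeed.initialize(
        model=model,
        model_parameters=model.parameters(),
        config=ds_config,
    )

    inputs = torch.randn(8, 32).to(engine.device)
    targets = torch.randn(8, 1).to(engine.device)

    loss = torch.nn.functional.mse_loss(engine(inputs), targets)
    engine.backward(loss)  # DeepSpeed handles loss scaling and gradient all-reduce
    engine.step()

Launched through the deepspeed launcher, the same script scales from a single GPU to multiple nodes without code changes.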
Phi-3.5 for Mac: Locally-run Vision and Language Models
Superduper: Integrate AI models and machine learning workflows
A real-time inference engine for temporal logic specifications
High-level Deep Learning Framework written in Kotlin
Official inference library for Mistral models
Framework dedicated to making neural data processing pipelines simple and fast