Browse free, open source LLM inference tools and projects for Windows below. Use the toggles on the left to filter the results by OS, license, language, programming language, and project status.
Port of OpenAI's Whisper model in C/C++
Run local LLMs on any device; open source
ONNX Runtime: cross-platform, high performance ML inferencing
User-friendly AI Interface
Port of Facebook's LLaMA model in C/C++
Ready-to-use OCR with 80+ supported languages
Lightweight anchor-free object detection model
Self-hosted, community-driven, local OpenAI-compatible API (see the request sketch after this list)
C++ library for high-performance inference on NVIDIA GPUs
OpenVINO™ Toolkit repository
A high-throughput and memory-efficient inference and serving engine
Protect and discover secrets using Gitleaks
Open-source AI camera; empower any camera/CCTV
The deep learning toolkit for speech-to-text
Library for OCR-related tasks powered by Deep Learning
High-performance neural network inference framework for mobile
Open standard for machine learning interoperability
Lightweight inference library for ONNX files, written in C++
Data manipulation and transformation for audio signal processing
An Open-Source Programming Framework for Agentic AI
Implementation of model parallel autoregressive transformers on GPUs
Uncover insights, surface problems, monitor, and fine-tune your LLM
Fast inference engine for Transformer models
State-of-the-art diffusion models for image and audio generation
A lightweight vision library for performing large-scale object detection
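Several of the servers listed above (the local OpenAI-compatible API and the high-throughput serving engine, for example) expose the standard OpenAI-style /v1/chat/completions route. The sketch below shows how such an endpoint is typically queried from Python; the host, port, and model name are placeholder assumptions for your own deployment, not values taken from any specific project.

    import json
    import urllib.request

    # Assumed local endpoint: servers such as LocalAI and vLLM expose an
    # OpenAI-compatible /v1/chat/completions route, but the host, port,
    # and model id below are placeholders for whatever your server uses.
    URL = "http://localhost:8080/v1/chat/completions"

    payload = {
        "model": "local-model",  # placeholder model id
        "messages": [
            {"role": "user", "content": "Summarize what an inference engine does."}
        ],
        "temperature": 0.2,
    }

    request = urllib.request.Request(
        URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

    with urllib.request.urlopen(request) as response:
        body = json.load(response)

    # OpenAI-compatible servers return the reply under choices[0].message.content.
    print(body["choices"][0]["message"]["content"])

Because the request and response follow the OpenAI schema, the same snippet works against any of the compatible servers in this list once the URL and model id are changed to match your setup.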