Port of Facebook's LLaMA model in C/C++
Python bindings for llama.cpp
Maid is a cross-platform Flutter app for interfacing with GGUF models
C#/.NET binding of llama.cpp, including LLaMA/GPT model inference
WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
Run Local LLMs on Any Device. Open-source
Personal AI, On Personal Devices
Clippy, now with some AI
The free, Open Source alternative to OpenAI, Claude and others
Interface for OuteTTS models
Open-source LLM load balancer and serving platform for hosting LLMs
Your Personal AI Assistant; easy to install, deploy locally or in the cloud
Distribute and run LLMs with a single file
Claude Code, but it runs on your Mac for free
Run AI models locally on your machine with Node.js bindings for llama.cpp
React and Electron-based app that executes the FreedomGPT LLM locally
QVAC Fabric: cross-platform LLM inference and fine-tuning
Vim plugin for LLM-assisted code/text completion
Run a full local LLM stack with one command using Docker
VS Code extension for LLM-assisted code/text completion
Inference Llama 2 in one file of pure C
DevoxxGenie is a plugin for IntelliJ IDEA that uses local LLMs
An easy-to-understand framework for LLM samplers
Powerful Android AI agent with tools, automation, and Linux shell