Run models such as Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, and Qwen
Start the Ollama service without needing Termux
Proxy that lets you use Ollama as a coding assistant, similar to GitHub Copilot
Raycast extension for Ollama
LLM plugin providing access to models running on an Ollama server
Modern, header-only C++ bindings for the Ollama API
A simple and easy-to-use library for interacting with the Ollama API
A single-file tkinter-based Ollama GUI project
Implement a CPU from scratch and experiment with large-model deployments
Ollama Python library
Ollama Telegram bot, with advanced configuration
A Swift client library for interacting with Ollama
Ollama-Laravel is a Laravel package providing seamless integration with Ollama
Chat with multiple PDFs locally
Discord bot that uses Ollama to interact with any LLM
A multi-platform desktop application to evaluate and compare LLMs
Ollama JavaScript library
Fully-featured web interface for Ollama LLMs
A wrapper around the Ollama API that lets you run different LLMs
A simple Java library for interacting with an Ollama server
Mac app for Ollama
Maid is a cross-platform Flutter app for interfacing with GGUF models
User-friendly AI Interface
The easiest way to use Ollama in .NET
AI interface for tinkerers (Ollama, Haystack RAG, Python)
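Most of the clients and integrations above talk to the same Ollama HTTP API. As a minimal sketch of what such a client sends, the snippet below builds a request payload for Ollama's `/api/chat` endpoint; the model name `llama3.2` is an assumption (use any model you have pulled), and the commented-out send step assumes a server on the default port 11434.

```python
import json
import urllib.request

# Payload shape accepted by Ollama's /api/chat endpoint.
# "llama3.2" is an assumption; substitute any locally pulled model.
payload = {
    "model": "llama3.2",
    "messages": [{"role": "user", "content": "Why is the sky blue?"}],
    "stream": False,  # request one JSON response instead of a token stream
}

body = json.dumps(payload).encode("utf-8")
print(body.decode("utf-8"))

# Sending it requires a running Ollama server (default http://localhost:11434):
# req = urllib.request.Request(
#     "http://localhost:11434/api/chat",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     reply = json.loads(resp.read())
#     print(reply["message"]["content"])
```

Libraries such as the Ollama Python and JavaScript libraries listed above wrap this same endpoint behind typed helper functions.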