Run models such as Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, and Qwen
Start the Ollama service directly, with no need for Termux
A single-file Tkinter-based Ollama GUI project
Raycast extension for Ollama
LLM plugin providing access to models running on an Ollama server
Modern, header-only C++ bindings for the Ollama API
A simple and easy-to-use library for interacting with the Ollama API
Proxy that lets you use Ollama as a coding copilot, similar to GitHub Copilot
Implement a CPU from scratch and experiment with large-model deployments
Ollama Python library
A Swift client library for interacting with Ollama
Ollama-Laravel is a Laravel package providing seamless integration with Ollama
Ollama Telegram bot, with advanced configuration
Discord bot that uses Ollama to interact with any LLM
A multi-platform desktop application to evaluate and compare LLMs
Chat with multiple PDFs locally
Ollama JavaScript library
Wraps the Ollama API, allowing you to run different LLMs
Fully-featured web interface for Ollama LLMs
Mac app for Ollama
The easiest way to use Ollama in .NET
User-friendly AI Interface
Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models
A simple Java library for interacting with Ollama server
AI interface for tinkerers (Ollama, Haystack RAG, Python)
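Most of the client libraries and interfaces listed above wrap Ollama's HTTP API. As a rough sketch of what such a wrapper sends under the hood, the JSON body below targets the `/api/chat` endpoint of a local Ollama server (default `http://localhost:11434`); the model name and prompt are placeholder values, and the request itself is only built, not sent:

```python
import json

def build_chat_request(model: str, prompt: str) -> str:
    """Build the JSON body a client would POST to /api/chat on an
    Ollama server. Model name and prompt here are hypothetical."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # request one complete reply instead of a token stream
    }
    return json.dumps(body)

payload = build_chat_request("llama3.2", "Why is the sky blue?")
print(payload)
```

A client would POST this body and read the reply from the `message.content` field of the response; streaming clients instead set `"stream": true` and consume newline-delimited JSON chunks.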