Run models such as Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, and Qwen
Start the Ollama service without needing Termux
A proxy that lets you use Ollama as a coding copilot, like GitHub Copilot
LLM plugin providing access to models running on an Ollama server
Modern, header-only C++ bindings for the Ollama API
Ollama Python library (see the chat sketch after this list)
Raycast extension for Ollama
A simple and easy-to-use library for interacting with the Ollama API
A single-file tkinter-based Ollama GUI project
Implement a CPU from scratch and experiment with large model deployments
A multi-platform desktop application to evaluate and compare LLMs
A Swift client library for interacting with Ollama
Ollama-Laravel is a Laravel package providing seamless integration with the Ollama API
Ollama Telegram bot with advanced configuration
A Discord bot that uses Ollama to interact with any LLM
Chat with multiple PDFs locally
Ollama JavaScript library
Fully-featured web interface for Ollama LLMs
Mac app for Ollama
Wraps the Ollama API, allowing you to run different LLMs (see the raw-endpoint sketch after this list)
Maid is a cross-platform Flutter app for interfacing with GGUF models
User-friendly AI Interface
The easiest way to use Ollama in .NET
A simple Java library for interacting with Ollama server
The terminal client for Ollama
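
To give a feel for how clients like these talk to Ollama, here is a minimal sketch using the Ollama Python library listed above. It assumes an Ollama server running on the default `http://localhost:11434` and a model that has already been pulled; the model name `llama3` is a placeholder, so substitute any model you have locally.

```python
# Minimal chat example with the Ollama Python library (`pip install ollama`).
# Assumes the Ollama server is running on the default port 11434 and that
# the model below has been pulled, e.g. with `ollama pull llama3`.
import ollama

response = ollama.chat(
    model="llama3",  # placeholder: use any model you have pulled
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

# The assistant's reply is available under message.content.
print(response["message"]["content"])
```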
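
Under the hood, the bindings, wrappers, and proxies in this list build on the server's REST API. Below is a hedged sketch of the same chat request made directly against the `POST /api/chat` endpoint with Python's `requests`; the same assumptions as above apply (local server on the default port, `llama3` as a placeholder model name).

```python
# Raw REST call to a local Ollama server's chat endpoint, roughly what
# the language bindings above wrap. Assumes the default port and a
# pulled model; "llama3" is a placeholder.
import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello!"}],
        "stream": False,  # return one JSON object instead of a stream
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])
```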