Run models like Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, Qwen, and more
Start the Ollama service directly, with no need for Termux
A single-file tkinter-based Ollama GUI project
Proxy that lets you use Ollama as a coding assistant, like GitHub Copilot
Raycast extension for Ollama
LLM plugin providing access to models running on an Ollama server
Modern, header-only C++ bindings for the Ollama API
A simple and easy-to-use library for interacting with the Ollama API
Implement a CPU from scratch and experiment with large-model deployments
Ollama Python library
Ollama Telegram bot, with advanced configuration
A Swift client library for interacting with Ollama
Ollama-Laravel is a Laravel package providing seamless integration with the Ollama API
Discord bot that uses Ollama to interact with any LLM
A multi-platform desktop application to evaluate and compare LLMs
Chat with multiple PDFs locally
Ollama JavaScript library
The New Windows Terminal
Bringing the awesome Git SCM to Windows
A node.js version management utility for Windows written in Go
Fully-featured WireGuard client for Windows
Chromium fork for Windows named after radioactive element No. 90
A simple yet powerful calculator that ships with Windows
Windows inside a Docker container
Wraps the Ollama API, letting you run different LLMs
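Most of the client libraries and bots above are thin wrappers over Ollama's HTTP API. A minimal sketch of calling the `/api/generate` endpoint with the standard library, assuming an Ollama server running at the default `localhost:11434` and a model (here `qwen2.5`, as an example) that has already been pulled:

```python
# Minimal Ollama REST client sketch (no third-party dependencies).
# Assumes a local Ollama server on the default port 11434.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks the server for one complete JSON response
    # instead of a stream of newline-delimited chunks.
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply carries the model output
        # under the "response" key.
        return json.load(resp)["response"]

if __name__ == "__main__":
    print(generate("qwen2.5", "Why is the sky blue?"))
```

The official Python and JavaScript libraries listed above add conveniences (streaming, chat history, typed responses) on top of this same endpoint.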