Showing 137 open source projects for "ollama"

  • 1
    Ollama

    Run models such as Kimi-K2.5, GLM-5, DeepSeek, gpt-oss, Gemma, and Qwen.

    Ollama also integrates with popular developer tools and AI agents, allowing seamless workflows across coding environments and applications. It supports REST APIs, Python, and JavaScript SDKs, making it easy to build AI-powered features into software projects. Overall, Ollama focuses on privacy, local-first AI execution, and developer-friendly tooling for building with open models.
    Downloads: 407 This Week
    Last Update:
    See Project
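The REST API mentioned above streams newline-delimited JSON. As a rough sketch (endpoint path and field names follow Ollama's public API documentation; the canned chunks below are illustrative, not real server output), a client can assemble a generate request and fold the streamed chunks back into a single string:

```python
import json

def build_generate_request(model, prompt):
    """Assemble the JSON body for a POST to http://localhost:11434/api/generate."""
    return {"model": model, "prompt": prompt, "stream": True}

def collect_stream(ndjson_lines):
    """Concatenate partial tokens from an NDJSON response stream.

    Each chunk carries a partial "response" string until a final
    chunk arrives with "done": true.
    """
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Canned chunks in the shape the server emits:
chunks = [
    '{"model":"gemma","response":"Hel","done":false}',
    '{"model":"gemma","response":"lo","done":false}',
    '{"model":"gemma","response":"","done":true}',
]
print(collect_stream(chunks))  # Hello
```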
  • 2
    Ollama-GUI

    A single-file tkinter-based Ollama GUI project

    Ollama GUI by chyok is a minimalist desktop-style interface built to simplify interaction with local Ollama models through a graphical environment rather than the command line. It is implemented as a lightweight single-file application using Python and Tkinter, which means it avoids heavy dependencies and can run with minimal setup on most systems.
    Downloads: 4 This Week
    Last Update:
    See Project
  • 3
    Raycast Ollama

    Raycast extension for Ollama

    Raycast Ollama is an extension for Raycast that integrates Ollama-based large language models directly into the macOS productivity launcher environment. It allows users to interact with local AI models through Raycast commands, enabling quick access to chat, text generation, and other AI-powered tasks without leaving their workflow. The extension is designed to be lightweight and fast, aligning with Raycast’s philosophy of keyboard-driven productivity.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 4
    llm-ollama

    LLM plugin providing access to models running on an Ollama server

    llm-ollama is a plugin for the LLM CLI ecosystem that enables seamless access to models hosted on an Ollama server through a unified command-line interface. It automatically discovers available models from the connected Ollama instance and registers them for use within the CLI, making it easy to run prompts, chat sessions, and embedding operations without manual configuration.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 5
    ollama-hpp

    Modern, Header-only C++ bindings for the Ollama API

    ollama-hpp is a C++ client library that provides a lightweight and efficient interface for interacting with the Ollama API in native applications. It is designed with performance and portability in mind, making it suitable for systems where low-level control and minimal overhead are important. The library exposes core Ollama functionality such as text generation and chat through a C++-friendly API, allowing developers to integrate local LLM capabilities into desktop, embedded, or high-performance applications. ...
    Downloads: 1 This Week
    Last Update:
    See Project
  • 6
    Ollama-rs

    A simple and easy-to-use library for interacting with the Ollama API

    Ollama-rs is a Rust library designed to provide a simple and efficient interface for interacting with the Ollama API, enabling developers to integrate local large language models into Rust applications. It follows the official Ollama API closely, ensuring compatibility while offering an idiomatic Rust experience with strong typing and asynchronous execution.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 7
    Ollama Copilot

    Proxy that lets you use Ollama as a coding copilot, like GitHub Copilot

    Ollama Copilot is a proxy-based tool that transforms locally hosted language models into a GitHub Copilot-style coding assistant for popular development environments. It acts as an intermediary server that exposes Ollama or other model providers through a Copilot-compatible interface, allowing developers to use local or self-hosted models for inline code completion.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 8
    Ollama Server

    Start the Ollama service on Android with no need for Termux

    Ollama Server is a mobile-first solution that brings the full Ollama runtime experience to Android devices through a simplified, one-click deployment model. Instead of relying on terminal environments like Termux, it provides a native application that launches and manages an Ollama-compatible service directly on a phone or tablet. The system exposes the same API behavior as standard Ollama installations, meaning any compatible client or integration can interact with it without modification. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 9
    handy-ollama

    Learn Ollama hands-on and explore large model deployment on CPU

    handy-ollama is an open-source educational project designed to help developers and AI enthusiasts learn how to deploy and run large language models locally using the Ollama platform. The repository serves as a structured tutorial that explains how to install, configure, and use Ollama to run modern language models on personal hardware without requiring advanced infrastructure.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 10
    Ollama Python

    Ollama Python library

    ollama-python is an open-source Python SDK that wraps the Ollama CLI, allowing seamless interaction with local large language models (LLMs) managed by Ollama. Developers use it to load models, send prompts, manage sessions, and stream responses directly from Python code. It simplifies integration of Ollama-based models into applications, supporting synchronous and streaming modes.
    Downloads: 0 This Week
    Last Update:
    See Project
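As a hedged sketch of the SDK's commonly documented call shape (`ollama.chat` taking a `messages` list of role/content dicts; the actual call requires the `ollama` package and a running local server, so it is isolated in its own function here):

```python
def make_messages(system, user):
    """Build the chat-message list the SDK expects."""
    msgs = []
    if system:
        msgs.append({"role": "system", "content": system})
    msgs.append({"role": "user", "content": user})
    return msgs

def chat_once(model, system, user):
    """Single round-trip to a local Ollama server (assumed reachable)."""
    import ollama  # pip install ollama
    resp = ollama.chat(model=model, messages=make_messages(system, user))
    return resp["message"]["content"]

print(make_messages("Be brief.", "What is Ollama?"))
```

Streaming works the same way with `stream=True`, which turns the response into an iterator of partial chunks.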
  • 11
    Ollama Telegram Bot

    Ollama Telegram bot, with advanced configuration

    ...Overall, ollama-telegram provides a lightweight and extensible solution for deploying personal or team-based AI assistants.
    Downloads: 2 This Week
    Last Update:
    See Project
  • 12
    Discord Ollama Integration

    Discord Bot that utilizes Ollama to interact with any LLMs

    Discord Ollama Integration is a TypeScript-based Discord bot that integrates directly with the Ollama runtime to provide conversational AI capabilities inside Discord servers. The project is designed to turn any Discord channel into an interactive AI assistant powered by locally hosted large language models, allowing users to chat, manage models, and control AI behavior through slash commands.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 13
    Ollama Swift Client

    A Swift client library for interacting with Ollama

    Ollama Swift Client is a native Swift client library that enables developers to interact with Ollama models directly from Apple platforms such as macOS, iOS, and iPadOS. It is designed to feel natural within the Swift ecosystem, using modern language features like async/await and strong typing to provide a clean and intuitive developer experience.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 14
    Ollama-Laravel Package

    Ollama-Laravel is a Laravel package providing seamless integration with the Ollama API

    Ollama-Laravel Package is a PHP package designed to integrate local large language models into Laravel applications through a clean and idiomatic interface. It abstracts the Ollama API into a developer-friendly facade, allowing Laravel developers to interact with models using familiar patterns such as service containers, configuration files, and fluent method chaining.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 15
    Ollama Grid Search

    A multi-platform desktop application to evaluate and compare LLMs

    Ollama Grid Search is a desktop application designed to automate the evaluation and comparison of large language models, prompts, and inference parameters in a structured and repeatable way. Instead of manually testing combinations, the tool performs grid search experiments by iterating across different models, prompt variations, and parameter configurations, allowing users to quickly identify optimal setups for specific tasks.
    Downloads: 1 This Week
    Last Update:
    See Project
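The grid-search idea the app automates can be sketched in a few lines: enumerate every combination of model, prompt, and parameter value, and record one run per combination. Here `run_inference` is a hypothetical stand-in for a call to a local Ollama model:

```python
import itertools

def run_inference(model, prompt, temperature):
    # Hypothetical placeholder for an actual model call.
    return f"[{model} @ temp={temperature}] reply to: {prompt}"

def grid_search(models, prompts, temperatures):
    """One run per (model, prompt, temperature) combination."""
    results = []
    for model, prompt, temp in itertools.product(models, prompts, temperatures):
        results.append({
            "model": model,
            "prompt": prompt,
            "temperature": temp,
            "output": run_inference(model, prompt, temp),
        })
    return results

runs = grid_search(["gemma", "qwen"], ["Summarize X"], [0.2, 0.8])
print(len(runs))  # 4
```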
  • 16
    Ollama JavaScript Library

    Ollama JavaScript library

    Ollama JavaScript is the official JavaScript client for integrating Ollama into JS and TS applications with a lightweight, developer-friendly API. It is designed around the Ollama REST API, so it feels consistent with the platform while making common tasks easier to handle in application code. The library supports standard chat interactions, text generation, embeddings, and model management, which makes it useful for both simple chat interfaces and more advanced AI-powered workflows. ...
    Downloads: 1 This Week
    Last Update:
    See Project
  • 17
    Ollama RAG Chatbot

    Chat with multiple PDFs locally

    Ollama RAG Chatbot is a local-first retrieval chatbot project built to let users chat with the contents of multiple PDF documents through a simple interface. The project is framed as an experiment, but its setup and packaging make it approachable for practical local use as well. It supports running on a local machine or in Kaggle, which lowers the barrier for users who want to test RAG workflows without building everything from scratch.
    Downloads: 0 This Week
    Last Update:
    See Project
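The retrieve-then-prompt loop behind a RAG chatbot like this one can be sketched naively: score document chunks against the question, keep the top matches, and splice them into the prompt. Token overlap stands in for real embeddings here; an actual setup would embed the PDF chunks (e.g. via an embedding model served by Ollama) and rank by vector similarity.

```python
import re

def tokens(text):
    """Lowercased word set, ignoring punctuation."""
    return set(re.findall(r"\w+", text.lower()))

def retrieve(query, chunks, k=2):
    """Return the k chunks sharing the most words with the query."""
    return sorted(chunks,
                  key=lambda c: len(tokens(query) & tokens(c)),
                  reverse=True)[:k]

def build_prompt(query, chunks, k=2):
    context = "\n".join(retrieve(query, chunks, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

pages = [
    "Ollama runs large language models locally on your machine.",
    "PDF parsing extracts text from each page of a document.",
    "The moon orbits the earth once a month.",
]
top = retrieve("How does Ollama run models locally?", pages, k=1)
print(top[0])
```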
  • 18
    NextJS Ollama LLM UI

    Fully-featured web interface for Ollama LLMs

    NextJS Ollama LLM UI is a web-based frontend interface built with Next.js to make interacting with Ollama-hosted large language models easy and fast. Its goal is to remove the complexity of setting up and managing UI components for local or offline LLM usage by providing a straightforward chat experience with support for responsive layouts, light and dark themes, and local chat history storage in the browser.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 19
    OllamaSharp

    The easiest way to use Ollama in .NET

    OllamaSharp is an open-source .NET library that provides strongly typed bindings for interacting with the Ollama API, making it easier for developers to integrate local large language models into C# and .NET applications. The project acts as a wrapper around the Ollama API, exposing all endpoints through asynchronous methods that allow developers to perform tasks such as generating text, creating embeddings, and managing models. It supports both local and remote Ollama instances, enabling developers to run AI models on their own hardware or connect to remote model servers. ...
    Downloads: 3 This Week
    Last Update:
    See Project
  • 20
    Ollamac

    Mac app for Ollama

    Ollamac is an open-source native macOS application that provides a graphical interface for interacting with local large language models running through the Ollama inference framework. The project was created to simplify the process of using local AI models, which typically require command-line interaction, by offering a clean and intuitive desktop interface. Through this interface, users can run and chat with a variety of LLM models installed through Ollama directly on their own machines. ...
    Downloads: 0 This Week
    Last Update:
    See Project
  • 21
    rollama

    Wraps the Ollama API, allowing you to run different LLMs from R

    ...The design mirrors familiar R workflows, allowing users to integrate AI capabilities into scripts, notebooks, and data pipelines with minimal friction. It also provides flexibility to extend functionality to any feature supported by the underlying Ollama API.
    Downloads: 1 This Week
    Last Update:
    See Project
  • 22
    Maid

    Maid is a cross-platform Flutter app for interfacing with GGUF / llama.cpp models

    Maid is a cross-platform, free and open source application for interfacing with llama.cpp models locally, and remotely with Ollama, Mistral, Google Gemini, and OpenAI models. Maid supports SillyTavern character cards to let you interact with all your favorite characters, and it can download a curated list of models in-app directly from Hugging Face.
    Downloads: 31 This Week
    Last Update:
    See Project
  • 23
    Open WebUI

    User-friendly AI Interface

    ...Additionally, Open WebUI offers a Progressive Web App (PWA) for mobile devices, providing offline access and a native app-like experience. The platform also includes a Model Builder, allowing users to create custom models from base Ollama models directly within the interface. With over 156,000 users, Open WebUI is a versatile solution for deploying and managing AI models in a secure, offline environment.
    Downloads: 130 This Week
    Last Update:
    See Project
  • 24
    Chipper

    AI interface for tinkerers (Ollama, Haystack RAG, Python)

    Chipper is an AI interface designed for tinkerers and developers, providing a platform to experiment with various AI models and techniques. It offers integration with tools like Ollama and Haystack for Retrieval-Augmented Generation (RAG), enabling users to build and test AI applications efficiently. Chipper supports Python and provides a modular architecture, allowing for customization and extension based on specific project requirements.
    Downloads: 0 This Week
    Last Update:
    See Project
  • 25
    DeepChat

    A smart assistant that connects powerful AI to your personal world

    ...DeepChat is a powerful open-source AI chat platform providing a unified interface for interacting with various large language models. Whether you're using cloud APIs like OpenAI, Gemini, Anthropic, or locally deployed Ollama models, DeepChat delivers a smooth user experience. As a cross-platform AI assistant application, DeepChat not only supports basic chat functionality but also offers advanced features such as search enhancement, tool calling, and multimodal interaction, making AI capabilities more accessible and efficient.
    Downloads: 29 This Week
    Last Update:
    See Project