Ollama Copilot is a proxy-based tool that turns locally hosted language models into a GitHub Copilot-style coding assistant for popular development environments. It runs as an intermediary server that exposes Ollama (or other model providers) through a Copilot-compatible interface, so developers can use local or self-hosted models for inline code completion. Supported providers include Ollama, DeepSeek, and Mistral, giving users the flexibility to choose between local and remote inference.

The tool integrates with editors such as Neovim, VS Code, Zed, and Emacs by redirecting Copilot traffic through a configurable proxy layer. Parameters such as context size, token-prediction limits, and prompt templates can be customized, giving developers granular control over how completions are generated. It also supports secure connections through TLS configuration and can be deployed as a background service for continuous availability.

Features

  • Copilot-style proxy for local or remote LLMs
  • Integration with multiple IDEs and editors
  • Support for multiple providers including Ollama and DeepSeek
  • Customizable prompt templates and system prompts
  • Configurable context window and token limits
  • Optional background service with TLS support


Categories

Code Editors

License

MIT License



Additional Project Details

Operating Systems

Windows

Programming Language

Go

Related Categories

Go Code Editors

Registered

2026-04-20