ChatOllama is an open-source chatbot platform built with Nuxt 3, designed to provide a private, extensible interface for working with multiple modern language model providers. It goes beyond a basic chat UI by supporting a broad model ecosystem, including OpenAI, Azure OpenAI, Anthropic, Google Gemini, Groq, Moonshot, Ollama, and other OpenAI-compatible services.

The platform also offers higher-level capabilities: AI agents, document-backed knowledge bases, real-time voice chat, and Model Context Protocol (MCP) integration for external tools. Its retrieval-augmented generation (RAG) functionality lets users upload documents and chat against knowledge bases, while vector database support scales retrieval beyond small document sets.

Deployment is streamlined with Docker Compose, and the project includes internationalization and modular feature toggles for enabling or disabling parts of the system. The result feels less like a single chatbot and more like a flexible, self-hosted AI workspace.
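The Docker Compose deployment mentioned above typically comes down to one compose file and one command. A minimal sketch, assuming the repository ships (or you write) a `docker-compose.yaml` with a single app service — the image name, port, and volume path here are illustrative assumptions, not ChatOllama's actual shipped configuration:

```yaml
# docker-compose.yaml — illustrative sketch; image name, port,
# and volume path are assumptions, not the project's real config
services:
  chatollama:
    image: chatollama/chatollama:latest  # assumed image name
    ports:
      - "3000:3000"                      # assumed web UI port
    volumes:
      - ./data:/app/data                 # persist knowledge bases and chat history
```

With a file like this in place, `docker compose up -d` starts the stack in the background and `docker compose down` stops it.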
## Features
- Multi-provider model support
- AI agents with tool access
- Knowledge bases with RAG document upload
- Real-time voice chat
- MCP-based tool integration
- Docker-ready deployment