ConfiChat is a lightweight, privacy-focused chat interface designed to give users full control over their interactions with large language models across both local and cloud providers. Built as a standalone application without heavy dependencies, it emphasizes simplicity, portability, and ease of setup across multiple platforms, including desktop and mobile environments. The tool supports local backends such as Ollama and llama.cpp for fully offline operation, while also allowing integration with cloud APIs such as OpenAI and Anthropic for access to more advanced capabilities. A key differentiator is its optional encryption of chat history and assets, ensuring that sensitive data can remain secure even when stored locally. Conversations are managed as local JSON files, giving users transparency and direct control over their data. Overall, ConfiChat is designed for users who prioritize privacy, flexibility, and independence from complex infrastructure while still maintaining access to capable models when they need them.
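Because sessions are plain JSON files on disk, they can be inspected, backed up, or edited with ordinary tools. The sketch below illustrates the idea with a hypothetical schema (the field names and layout are assumptions for illustration, not ConfiChat's actual on-disk format):

```python
import json
from pathlib import Path

# Hypothetical session schema; ConfiChat's real format may differ.
session = {
    "title": "Example chat",
    "model": "llama3",
    "messages": [
        {"role": "user", "content": "Hello!"},
        {"role": "assistant", "content": "Hi, how can I help?"},
    ],
}

# Write the session as human-readable JSON...
path = Path("example_session.json")
path.write_text(json.dumps(session, indent=2), encoding="utf-8")

# ...and read it back: no database, no proprietary format, just a file.
restored = json.loads(path.read_text(encoding="utf-8"))
print(restored["messages"][0]["content"])
```

Keeping the store this simple is what makes the "transparency and direct control" claim concrete: any JSON-aware tool can read or migrate the data.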
Features
- Offline-first support with local model providers
- Optional encryption for chat history and assets
- Cross-platform compatibility including mobile
- Integration with cloud AI APIs
- Lightweight standalone architecture
- Local storage using JSON-based session management
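For the offline-first path, a local provider such as Ollama exposes an HTTP API on the local machine (by default at `http://localhost:11434`), so no data leaves the device. The sketch below builds a request against Ollama's `/api/generate` endpoint using only the standard library; the model name and prompt are placeholders, and the final call is shown commented out since it requires a running Ollama server:

```python
import json
import urllib.request

# Placeholder values: use any model already pulled locally (e.g. via `ollama pull`).
payload = {
    "model": "llama3",
    "prompt": "Summarize this conversation in one sentence.",
    "stream": False,  # request a single JSON response instead of a token stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With an Ollama server running locally, the exchange stays entirely offline:
# resp = urllib.request.urlopen(req)
# print(json.loads(resp.read())["response"])
```

Swapping in a cloud provider is the same pattern with a different endpoint and an API key; the chat interface itself does not change.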