Clippy is an open-source desktop assistant that lets users run modern large language models locally while presenting them through a nostalgic interface inspired by Microsoft's classic Clippy assistant from the 1990s. The project serves as both a playful homage to the early days of personal computing and a practical demonstration of local AI inference.

Clippy integrates with the llama.cpp runtime to run models directly on a user's computer, with no cloud-based AI service required. It supports models in the GGUF format, which allows it to run many publicly available open-source LLMs efficiently on consumer hardware. Users interact with the system through a simple animated assistant interface that can answer questions, generate text, and perform conversational tasks. The application includes one-click installation support for several popular models, such as Meta's Llama, Google's Gemma, and other open models.
## Features
- Local execution of large language models using llama.cpp
- Support for the GGUF model format used by many open LLMs
- Retro desktop assistant interface inspired by Microsoft Clippy
- One-click installation for several popular open-source models
- Offline AI interaction without reliance on cloud services
- Lightweight desktop application focused on simplicity and privacy
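As a rough illustration of what GGUF support involves, the sketch below parses the fixed-size header that every GGUF file begins with (magic bytes, format version, tensor count, and metadata key/value count, per the GGUF specification). This is a stdlib-only, hypothetical helper for illustration, not code from Clippy itself:

```python
import struct

GGUF_MAGIC = b"GGUF"  # first four bytes of every GGUF file

def read_gguf_header(data: bytes) -> dict:
    """Parse the fixed GGUF header per the spec: magic, uint32 version,
    uint64 tensor count, uint64 metadata KV count (all little-endian)."""
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    version, tensor_count, kv_count = struct.unpack_from("<IQQ", data, 4)
    return {
        "version": version,
        "tensor_count": tensor_count,
        "metadata_kv_count": kv_count,
    }

# Build a minimal fake header to exercise the parser; a real model file
# would continue with metadata key/value pairs and tensor descriptors.
fake = GGUF_MAGIC + struct.pack("<IQQ", 3, 291, 24)
print(read_gguf_header(fake))
# → {'version': 3, 'tensor_count': 291, 'metadata_kv_count': 24}
```

In practice an application like Clippy delegates this parsing to llama.cpp, but a quick magic-byte check like the one above is a common way to validate a downloaded model file before handing it to the runtime.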