| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| StudioOllamaUI_v1.2_ESP_ENG.7z | < 19 hours ago | 1.9 GB | |
| CHANGELOG.md | < 19 hours ago | 3.3 kB | |
| StudioOllamaUI_v1.1_ESP_ENG.7z | < 24 hours ago | 1.9 GB | |
| README_EN.md | 2026-01-30 | 1.5 kB | |
| README.md | 2026-01-30 | 1.7 kB | |
| StudioOllamaUI_1.0_spanish.7z | 2026-01-30 | 1.9 GB | |
| Totals: 6 Items | | 5.7 GB | 2 |
# StudioOllamaUI
Portable graphical interface for Ollama, focused on privacy, ease of use, and local execution.
## Description
StudioOllamaUI is a fully portable application for interacting with large language models (LLMs) through Ollama, with no complex installation. It is designed to run locally, preserve user privacy, and work from any folder or USB drive.
## Features
- 100% portable application.
- No writes outside the root directory.
- Persistent chat history, models, and API keys.
- Support for local and cloud models.
- File system reading support.
- No Docker, Node.js, Python, or external installs required.
- Free and freely distributable.
## System requirements
- Windows 10 / 11 (64-bit).
- Minimum 4 GB of RAM (for quantized or cloud models).
- Automatic GPU usage if VRAM is available.
- At least 5 GB of free disk space.
## Getting started
- Run `StudioOllamaUI.exe`.
- Three terminal windows will open (do not close them).
- The default model is `qwen3:0.6b`.
- Manage models using the orange button.
- Closing Firefox will stop all servers automatically.
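Once the servers are running, the underlying Ollama instance can also be queried directly over HTTP. As a minimal sketch, assuming the bundled Ollama listens on its default port (11434) and the default `qwen3:0.6b` model is installed:

```python
import json
import urllib.request

# Default local Ollama endpoint (assumed; StudioOllamaUI may use another port).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # Ollama's /api/generate takes a JSON body with the model name, the
    # prompt, and a stream flag; stream=False returns one complete reply.
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # The non-streaming reply carries the generated text in "response".
        return json.loads(resp.read())["response"]
```

For example, `ask("qwen3:0.6b", "Say hello")` returns the model's reply as a string, without going through the graphical interface.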
## API Keys
- Ollama API key (free): required for cloud models.
- Tavily API key (free): recommended for web search.
- Google / Bing search APIs (paid alternatives).
- DuckDuckGo requires no API key.
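StudioOllamaUI stores and uses these keys itself, but the kind of call a web-search provider expects can be sketched. A minimal example against Tavily's search endpoint, assuming the field names from Tavily's public API documentation (`api_key`, `query`, `max_results`):

```python
import json
import urllib.request

# Tavily search endpoint (per Tavily's public documentation).
TAVILY_URL = "https://api.tavily.com/search"

def build_search_request(api_key: str, query: str, max_results: int = 5) -> dict:
    # Minimal JSON body for a Tavily search; field names are assumptions
    # based on Tavily's documented API, not on StudioOllamaUI internals.
    return {"api_key": api_key, "query": query, "max_results": max_results}

def web_search(api_key: str, query: str) -> list:
    req = urllib.request.Request(
        TAVILY_URL,
        data=json.dumps(build_search_request(api_key, query)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Tavily returns a JSON object whose "results" list holds the hits.
        return json.loads(resp.read()).get("results", [])
```

The search results are then passed to the model as context; with DuckDuckGo as the provider, no key is needed at all.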
## Security
- Local execution only.
- Internet access only when web search is enabled.
- Does not execute commands automatically.
- No user data is collected.
## License
GPL-3.0
## Credits
- Developer: Francesc Roig
- Contact: francesc@vivaldi.net
- GitHub: https://github.com/francescroig/StudioOllamaUI