Name | Modified | Size | Downloads / Week |
---|---|---|---|
Ollama.dmg | 2025-09-26 | 47.9 MB | |
OllamaSetup.exe | 2025-09-26 | 1.2 GB | |
ollama-linux-arm64.tgz | 2025-09-26 | 1.9 GB | |
ollama-linux-arm64-jetpack6.tgz | 2025-09-26 | 359.6 MB | |
ollama-linux-arm64-jetpack5.tgz | 2025-09-26 | 450.2 MB | |
ollama-darwin.tgz | 2025-09-26 | 25.1 MB | |
Ollama-darwin.zip | 2025-09-26 | 47.8 MB | |
ollama-linux-amd64-rocm.tgz | 2025-09-26 | 1.1 GB | |
ollama-linux-amd64.tgz | 2025-09-26 | 1.9 GB | |
ollama-windows-amd64-rocm.zip | 2025-09-26 | 257.5 MB | |
ollama-windows-amd64.zip | 2025-09-26 | 1.9 GB | |
ollama-windows-arm64.zip | 2025-09-26 | 22.1 MB | |
sha256sum.txt | 2025-09-26 | 1.1 kB | |
README.md | 2025-09-26 | 1.3 kB | |
v0.12.3 source code.tar.gz | 2025-09-26 | 11.0 MB | |
v0.12.3 source code.zip | 2025-09-26 | 11.4 MB | |
Totals: 16 Items | | 9.3 GB | 131 |
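The listing includes a sha256sum.txt alongside the archives. A quick way to verify a downloaded archive against it is sketched below, assuming GNU coreutils and that the checksum file sits next to the download (on macOS, `shasum -a 256 -c` is the rough equivalent):

```
# Check only the archives present locally; --ignore-missing skips
# entries in sha256sum.txt for files that were not downloaded.
sha256sum --ignore-missing -c sha256sum.txt
```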
New models
- DeepSeek-V3.1-Terminus: a hybrid model that supports both thinking and non-thinking modes, delivering more stable and reliable outputs across benchmarks than the previous version (an example request for toggling thinking follows this list).
  - Run on Ollama's cloud: `ollama run deepseek-v3.1:671b-cloud`
  - Run locally (requires 500GB+ of VRAM): `ollama run deepseek-v3.1`
- Kimi-K2-Instruct-0905: the latest and most capable version of Kimi K2, a state-of-the-art mixture-of-experts (MoE) language model with 32 billion activated parameters and 1 trillion total parameters.
  - Run on Ollama's cloud: `ollama run kimi-k2:1t-cloud`
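A minimal sketch of toggling DeepSeek-V3.1's thinking mode through Ollama's chat API, assuming a local server on the default port and that the model accepts the top-level `think` flag the way other thinking-capable models do:

```
# Ask for a direct answer with thinking disabled; set "think": true
# to get the model's reasoning back in the response as well.
curl http://localhost:11434/api/chat -d '{
  "model": "deepseek-v3.1",
  "messages": [{"role": "user", "content": "Why is the sky blue?"}],
  "think": false,
  "stream": false
}'
```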
What's Changed
- Fixed issue where tool calls provided as stringified JSON would not be parsed correctly (an illustrative request follows this list)
- `ollama push` will now provide a URL to follow to sign in
- Fixed issues where qwen3-coder would output Unicode characters incorrectly
- Fixed issue where loading a model with `/load` would crash
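As a hedged illustration of the tool-call parsing fix above: a chat request that replays an assistant turn whose function arguments arrive as a JSON-encoded string rather than an object. The model, tool name, and values are made up for the example; only the shape of the `tool_calls` entry matters here.

```
# The "arguments" field is a stringified JSON object, as some clients
# send it; requests like this previously failed to parse.
curl http://localhost:11434/api/chat -d '{
  "model": "qwen3-coder",
  "messages": [
    {"role": "user", "content": "What is the weather in Toronto?"},
    {"role": "assistant", "tool_calls": [{
      "function": {
        "name": "get_weather",
        "arguments": "{\"city\": \"Toronto\"}"
      }
    }]},
    {"role": "tool", "content": "22 degrees and sunny"}
  ],
  "stream": false
}'
```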
New Contributors
- @gr4ceG made their first contribution in https://github.com/ollama/ollama/pull/12385
Full Changelog: https://github.com/ollama/ollama/compare/v0.12.2...v0.12.3