| Name | Modified | Size |
|---|---|---|
| LocalAI-v3.12.1-checksums.txt | 2026-02-21 | 382 Bytes |
| local-ai-launcher-linux.tar.xz | 2026-02-21 | 16.2 MB |
| local-ai-v3.12.1-darwin-arm64 | 2026-02-21 | 78.1 MB |
| local-ai-v3.12.1-linux-amd64 | 2026-02-21 | 78.8 MB |
| local-ai-v3.12.1-linux-arm64 | 2026-02-21 | 76.1 MB |
| LocalAI-v3.12.1-source.tar.gz | 2026-02-21 | 10.7 MB |
| LocalAI.dmg | 2026-02-21 | 12.0 MB |
| README.md | 2026-02-21 | 1.0 kB |
| v3.12.1 source code.tar.gz | 2026-02-21 | 10.8 MB |
| v3.12.1 source code.zip | 2026-02-21 | 11.2 MB |
| Totals: 10 items (26 downloads/week) | | 293.9 MB |
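The release ships a `LocalAI-v3.12.1-checksums.txt` alongside the binaries. A minimal sketch of verifying a downloaded artifact against such a file, demonstrated with a stand-in file (this assumes the checksums file is in `sha256sum` format; check its contents before relying on that):

```shell
set -eu
# Work in a scratch directory with a stand-in artifact.
tmp=$(mktemp -d)
cd "$tmp"
printf 'demo payload' > local-ai-demo-binary

# Produce a checksums file in sha256sum format, then verify against it.
# For the real release, the equivalent would be (hypothetically, assuming SHA-256):
#   sha256sum -c --ignore-missing LocalAI-v3.12.1-checksums.txt
sha256sum local-ai-demo-binary > checksums.txt
sha256sum -c checksums.txt   # prints: local-ai-demo-binary: OK
```

`--ignore-missing` is useful when you only downloaded one of the listed artifacts, so unfetched files are skipped instead of reported as failures.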
This is a patch release that tags a new llama.cpp version, fixing incompatibilities with Qwen 3 Coder.
## What's Changed

### Other Changes
- docs: :arrow_up: update docs version mudler/LocalAI by @localai-bot in https://github.com/mudler/LocalAI/pull/8611
- feat(traces): Add backend traces by @richiejp in https://github.com/mudler/LocalAI/pull/8609
- chore: :arrow_up: Update ggml-org/llama.cpp to b908baf1825b1a89afef87b09e22c32af2ca6548 by @localai-bot in https://github.com/mudler/LocalAI/pull/8612
- chore: drop bark.cpp leftovers from pipelines by @mudler in https://github.com/mudler/LocalAI/pull/8614
- fix: merge openresponses messages by @mudler in https://github.com/mudler/LocalAI/pull/8615
- chore: :arrow_up: Update ggml-org/llama.cpp to ba3b9c8844aca35ecb40d31886686326f22d2214 by @localai-bot in https://github.com/mudler/LocalAI/pull/8613
**Full Changelog**: https://github.com/mudler/LocalAI/compare/v3.12.0...v3.12.1