Download Latest Version llama-b9174-bin-ubuntu-openvino-2026.0-x64.tar.gz (12.3 MB)

Name    Modified    Size
llama-b9174-xcframework.zip < 11 hours ago 202.6 MB
llama-b9174-bin-win-vulkan-x64.zip < 11 hours ago 32.7 MB
llama-b9174-bin-win-sycl-x64.zip < 11 hours ago 111.6 MB
llama-b9174-bin-win-opencl-adreno-arm64.zip < 11 hours ago 10.1 MB
llama-b9174-bin-win-hip-radeon-x64.zip < 11 hours ago 319.4 MB
llama-b9174-bin-win-cuda-13.1-x64.zip < 11 hours ago 137.1 MB
llama-b9174-bin-win-cuda-12.4-x64.zip < 11 hours ago 218.3 MB
llama-b9174-bin-win-cpu-x64.zip < 11 hours ago 15.9 MB
llama-b9174-bin-win-cpu-arm64.zip < 11 hours ago 9.5 MB
llama-b9174-bin-ubuntu-x64.tar.gz < 11 hours ago 14.0 MB
llama-b9174-bin-ubuntu-vulkan-x64.tar.gz < 11 hours ago 31.5 MB
llama-b9174-bin-ubuntu-vulkan-arm64.tar.gz < 11 hours ago 24.8 MB
llama-b9174-bin-ubuntu-sycl-fp32-x64.tar.gz < 11 hours ago 44.3 MB
llama-b9174-bin-ubuntu-sycl-fp16-x64.tar.gz < 11 hours ago 44.5 MB
llama-b9174-bin-ubuntu-s390x.tar.gz < 11 hours ago 12.5 MB
llama-b9174-bin-ubuntu-rocm-7.2-x64.tar.gz < 11 hours ago 129.3 MB
llama-b9174-bin-ubuntu-openvino-2026.0-x64.tar.gz < 11 hours ago 12.3 MB
llama-b9174-bin-ubuntu-arm64.tar.gz < 11 hours ago 11.0 MB
llama-b9174-bin-macos-x64.tar.gz < 11 hours ago 8.5 MB
llama-b9174-bin-macos-arm64.tar.gz < 11 hours ago 8.5 MB
llama-b9174-bin-macos-arm64-kleidiai.tar.gz < 11 hours ago 8.5 MB
llama-b9174-bin-android-arm64.tar.gz < 11 hours ago 65.2 MB
llama-b9174-bin-910b-openEuler-x86-aclgraph.tar.gz < 11 hours ago 11.6 MB
llama-b9174-bin-910b-openEuler-aarch64-aclgraph.tar.gz < 11 hours ago 10.9 MB
llama-b9174-bin-310p-openEuler-x86.tar.gz < 11 hours ago 11.6 MB
llama-b9174-bin-310p-openEuler-aarch64.tar.gz < 11 hours ago 10.9 MB
cudart-llama-bin-win-cuda-13.1-x64.zip < 11 hours ago 402.6 MB
cudart-llama-bin-win-cuda-12.4-x64.zip < 11 hours ago 391.4 MB
b9174 source code.tar.gz < 13 hours ago 33.9 MB
b9174 source code.zip < 13 hours ago 35.3 MB
README.md < 13 hours ago 7.2 kB
Totals: 31 items, 2.4 GB
ui: Restructure repo to use `tools/ui` folder and `ui` / `UI` / `llama-ui` / `LLAMA_UI` naming (#23064)

* webui: Move static build output from `tools/server/public` to `build/ui` directory
* refactor: Move to `tools/ui`
* refactor: rename CMake variables and preprocessor defines
  - Rename LLAMA_BUILD_WEBUI -> LLAMA_BUILD_UI (old kept as deprecated)
  - Rename LLAMA_USE_PREBUILT_WEBUI -> LLAMA_USE_PREBUILT_UI (old kept as deprecated)
  - Backward compat: old vars auto-forward to new ones with DEPRECATION warning
  - Rename internal vars: WEBUI_SOURCE -> UI_SOURCE, WEBUI_SOURCE_DIR -> UI_SOURCE_DIR, etc.
  - Rename HF bucket: LLAMA_WEBUI_HF_BUCKET -> LLAMA_UI_HF_BUCKET
  - Emit both LLAMA_BUILD_WEBUI and LLAMA_BUILD_UI preprocessor defines
  - Emit both LLAMA_WEBUI_DEFAULT_ENABLED and LLAMA_UI_DEFAULT_ENABLED
* refactor: rename CLI flags (--webui -> --ui) with backward compat
  - Add --ui/--no-ui (old --webui/--no-webui kept as deprecated aliases)
  - Add --ui-config (old --webui-config kept as deprecated alias)
  - Add --ui-config-file (old --webui-config-file kept as deprecated alias)
  - Add --ui-mcp-proxy/--no-ui-mcp-proxy (old --webui-mcp-proxy kept as deprecated)
  - Add new env vars: LLAMA_ARG_UI, LLAMA_ARG_UI_CONFIG, LLAMA_ARG_UI_CONFIG_FILE, LLAMA_ARG_UI_MCP_PROXY
  - C++ struct fields: params.ui, params.ui_config_json, params.ui_mcp_proxy added alongside old fields
  - Backward compat: old fields synced to new ones in g_params_to_internals
* refactor: update C++ server internals with backward compat
  - Rename json_webui_settings -> json_ui_settings (both kept in server_context_meta)
  - Rename params.webui usage -> params.ui (both synced, old still works)
  - JSON API emits both "ui"/"ui_settings" and "webui"/"webui_settings" keys
  - Server routes use params.ui_mcp_proxy || params.webui_mcp_proxy
  - Preprocessor guards use #if defined(LLAMA_BUILD_UI) || defined(LLAMA_BUILD_WEBUI)
* refactor: rename CI/CD workflows, artifacts, and build script
  - Rename webui-build.yml -> ui-build.yml; artifact webui-build -> ui-build
  - Rename webui-publish.yml -> ui-publish.yml; var HF_BUCKET_WEBUI_STATIC_OUTPUT -> HF_BUCKET_UI_STATIC_OUTPUT
  - Rename server-webui.yml -> server-ui.yml; job webui-build/checks -> ui-build/checks
  - Update server.yml: job/artifact refs webui-build -> ui-build
  - Update release.yml: all webui-build/publish refs -> ui-build/publish; HF_TOKEN_WEBUI_STATIC_OUTPUT -> HF_TOKEN_UI_STATIC_OUTPUT
  - Update server-self-hosted.yml: webui-build -> ui-build
  - Update build-self-hosted.yml: HF_WEBUI_VERSION -> HF_UI_VERSION
  - Rename webui-download.cmake -> ui-download.cmake (internal refs updated)
  - Update labeler.yml: server/webui -> server/ui path label
* docs: update CODEOWNERS and server README docs
  - Update CODEOWNERS: team ggml-org/llama-webui -> ggml-org/llama-ui, path /tools/server/webui/ -> /tools/ui/
  - Update server README.md: CLI tables show --ui flags with deprecated --webui aliases
  - Update server README-dev.md: "WebUI" -> "UI", paths updated to tools/ui/
* fix: Small fixes for UI build
* fix: CMake.txt syntax
* chore: Formatting
* fix: `.editorconfig` for llama-ui
* chore: Formatting
* refactor: Use `APP_NAME` in Error route
* refactor: Cleanup
* refactor: Single migration service
* make llama-ui a linkable target
* fix: UI Build output
* fix: Missing change
* fix: separate llama-ui npm build output into build/tools/ui/dist subfolder + use cmake npm build instead of downloading ui-build.yml artifacts in CI
* refactor: UI workflows cleanup

Co-authored-by: Xuan Son Nguyen <son@huggingface.co>
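The release's backward-compat approach keeps every old `--webui*` flag as a deprecated alias that feeds the same setting as the new `--ui*` flag. As a minimal illustration of that pattern (a Python argparse sketch, not llama.cpp's actual C++ argument parser):

```python
# Illustration of the deprecated-alias pattern described above: the old
# flags (--webui/--no-webui) share a destination with the new flags
# (--ui/--no-ui), so old invocations keep working, and using an old flag
# emits a deprecation warning. This is a sketch, not llama.cpp's parser.
import argparse
import warnings

def parse_args(argv):
    parser = argparse.ArgumentParser()
    parser.add_argument("--ui", dest="ui", action="store_true")
    parser.add_argument("--no-ui", dest="ui", action="store_false")
    # Deprecated aliases: same dest, so the new setting is updated either way.
    parser.add_argument("--webui", dest="ui", action="store_true",
                        help="deprecated alias for --ui")
    parser.add_argument("--no-webui", dest="ui", action="store_false",
                        help="deprecated alias for --no-ui")
    parser.set_defaults(ui=True)  # UI enabled by default
    args = parser.parse_args(argv)
    if "--webui" in argv or "--no-webui" in argv:
        warnings.warn("--webui/--no-webui are deprecated; use --ui/--no-ui",
                      DeprecationWarning)
    return args

assert parse_args(["--no-webui"]).ui is False  # old flag still honored
assert parse_args(["--ui"]).ui is True
```

The same idea applies at the CMake level (old variables auto-forward to new ones with a DEPRECATION warning) and in the JSON API, which emits both the old and new keys during the transition.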

macOS/iOS: llama-b9174-bin-macos-arm64.tar.gz, llama-b9174-bin-macos-arm64-kleidiai.tar.gz, llama-b9174-bin-macos-x64.tar.gz, and llama-b9174-xcframework.zip

Linux: llama-b9174-bin-ubuntu-*.tar.gz builds (CPU x64/arm64/s390x, Vulkan, SYCL fp16/fp32, ROCm 7.2, OpenVINO 2026.0)

Android: llama-b9174-bin-android-arm64.tar.gz

Windows: llama-b9174-bin-win-*.zip builds (CPU x64/arm64, CUDA 12.4/13.1, HIP for Radeon, SYCL, Vulkan, OpenCL for Adreno); matching CUDA runtime DLLs ship in the cudart-llama-bin-win-cuda-*.zip archives

openEuler: llama-b9174-bin-910b-*-aclgraph.tar.gz and llama-b9174-bin-310p-*.tar.gz builds (x86 and aarch64)

Source: README.md, updated 2026-05-16