⚠️ Heads up — this release contains breaking changes. Read the full release notes here: v1.84.0 release notes
Verify Docker Image Signature
All LiteLLM Docker images are signed with cosign. Every release is signed with the same key introduced in commit 0112e53.
Verify using the pinned commit hash (recommended):
A commit hash is cryptographically immutable, so this is the strongest way to ensure you are using the original signing key:
```bash
cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/0112e53046018d726492c814b3644b7d376029d0/cosign.pub \
  ghcr.io/berriai/litellm:v1.84.0
```
Verify using the release tag (convenience):
Tags are protected in this repository and resolve to the same key. This option is easier to read but relies on tag protection rules:
```bash
cosign verify \
  --key https://raw.githubusercontent.com/BerriAI/litellm/v1.84.0/cosign.pub \
  ghcr.io/berriai/litellm:v1.84.0
```
Expected output:
```text
The following checks were performed on each of these signatures:
  - The cosign claims were validated
  - The signatures were verified against the specified public key
```
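In CI, the pinned-commit check can be scripted ahead of a pull or deploy. A minimal sketch: it builds the key URL from the commit hash given in these notes and prints the verify command; the `IMAGE_TAG` variable and the deploy-gate wiring are illustrative, not part of the release itself.

```shell
#!/bin/sh
# Commit that introduced the signing key (from the release notes above).
COMMIT="0112e53046018d726492c814b3644b7d376029d0"
IMAGE_TAG="v1.84.0"   # hypothetical: substitute the tag you are deploying

# Pin the public key to the immutable commit rather than a tag.
KEY_URL="https://raw.githubusercontent.com/BerriAI/litellm/${COMMIT}/cosign.pub"
IMAGE="ghcr.io/berriai/litellm:${IMAGE_TAG}"

# Print the verify command; a CI step would run it and gate the deploy
# on its exit code (cosign exits non-zero when verification fails).
echo "cosign verify --key ${KEY_URL} ${IMAGE}"
```

A pipeline would replace the final `echo` with the command itself and abort on failure, so an unsigned or tampered image never reaches the deploy step.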
What's Changed
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/25521
- merge litellm_internal_staging by @Sameerlite in https://github.com/BerriAI/litellm/pull/25949
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26304
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26379
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26437
- fix(redis): cache GCP IAM token to prevent async event loop blocking by @harish-berri in https://github.com/BerriAI/litellm/pull/26441
- litellm oss branch by @krrish-berri-2 in https://github.com/BerriAI/litellm/pull/26386
- fix noma v2 deepcopy crashing in build scan payload - new PR by @omriShukrun08 in https://github.com/BerriAI/litellm/pull/26605
- fix(ui): use stored-credentials endpoint for tools fetch on MCP edit page by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26002
- feat(proxy): add --timeout_worker_healthcheck flag for uvicorn worker triage by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26622
- fix(ci): support CircleCI rerun failed tests for local_testing jobs by @mateo-berri in https://github.com/BerriAI/litellm/pull/26461
- docs: update pull_request_template to add Linear ticket mentioning by @mateo-berri in https://github.com/BerriAI/litellm/pull/26655
- fix(pricing): GPT-5.5 Pro Pricing by @lmcdonald-godaddy in https://github.com/BerriAI/litellm/pull/26651
- feat(proxy): Add cleanup job for expired LiteLLM dashboard session keys by @milan-berri in https://github.com/BerriAI/litellm/pull/26460
- fix(ui): move 'Store Prompts in Spend Logs' toggle to Admin Settings by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26631
- fix(caching): preserve prompt_tokens_details through embedding cache round-trip by @michelligabriele in https://github.com/BerriAI/litellm/pull/26653
- feat(logging): add retry settings for generic API logger by @milan-berri in https://github.com/BerriAI/litellm/pull/26645
- fix(logging): backfill streaming hidden response cost by @milan-berri in https://github.com/BerriAI/litellm/pull/26606
- fix(vertex-ai): reuse anthropic messages config instances by @Sameerlite in https://github.com/BerriAI/litellm/pull/26099
- fix(vertex): preserve items on array branches in anyOf with null + de-flake test by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26675
- fix(tests): replace deprecated Bedrock Claude 3.7 Sonnet model ID by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26721
- [Fix] Cache LiteLLM_Config param reads in DualCache and batch by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26469
- [Feat] Lazy-load optional feature routers on first request by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26534
- [Fix] Unify cost calc in success_handler dict and typed branches by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26629
- Revert "[Feat] Lazy-load optional feature routers on first request" by @krrish-berri-2 in https://github.com/BerriAI/litellm/pull/26727
- [Infra] Version Bump by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26728
- [Infra] Promote Internal Staging to main by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26731
- ci(release): accept PEP 440 tag forms in create-release workflow by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26734
- [Feat] Add gpt-image-2 support (#26644) by @ishaan-berri in https://github.com/BerriAI/litellm/pull/26705
- feat(provider): add AIHubMix as an OpenAI-compatible provider by @xinrui-z in https://github.com/BerriAI/litellm/pull/24294
- merge internal staging by @Sameerlite in https://github.com/BerriAI/litellm/pull/26737
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26742
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26745
- fix(test): scope ERROR log assertion to LiteLLM logger in test_model_alias_map by @mateo-berri in https://github.com/BerriAI/litellm/pull/26741
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26757
- fix(bedrock, anthropic): translate OpenAI file content on tool-result path by @minznerjosh in https://github.com/BerriAI/litellm/pull/26710
- remove /ui/chat page by @ishaan-berri in https://github.com/BerriAI/litellm/pull/26739
- fix: add optional TCP SO_KEEPALIVE support to aiohttp's TCPConnector by @yassinkortam in https://github.com/BerriAI/litellm/pull/26730
- feat(proxy): LiteLLM headers on Google native generateContent routes by @Sameerlite in https://github.com/BerriAI/litellm/pull/25500
- feat(vector-stores): support Bedrock retrievalConfiguration passthrough by @Sameerlite in https://github.com/BerriAI/litellm/pull/26685
- feat(mcp): opt-in short-ID tool prefix to keep MCP tool names under the 60-char limit by @mateo-berri in https://github.com/BerriAI/litellm/pull/26733
- fix(proxy): self-heal Prisma read paths + harden reconnect state machine by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26756
- [Fix] Redact spend logs error message by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26662
- [Feat] Add support for Azure Entra discovery endpoint by @Sameerlite in https://github.com/BerriAI/litellm/pull/26584
- [Fix] Proxy: reconnect Prisma DB without blocking the event loop by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26225
- feat(proxy): durable agent workflow run tracking via /v1/workflows/runs by @ishaan-berri in https://github.com/BerriAI/litellm/pull/26793
- chore(auth): tighten clientside api_base handling by @stuxf in https://github.com/BerriAI/litellm/pull/26518
- fix: drop sensitive locals from re-raised error messages by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26823
- test(vertex-batches): set is_redirect=False on mocked retrieve response by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26844
- chore(vector-stores): redact credentials in list/info/update responses; gate update by per-store access by @stuxf in https://github.com/BerriAI/litellm/pull/26489
- chore(auth): substitute alias for master key on UserAPIKeyAuth by @stuxf in https://github.com/BerriAI/litellm/pull/26484
- fix(mcp): tighten public-route detection and OAuth2 fallback gating by @stuxf in https://github.com/BerriAI/litellm/pull/26463
- [Fix] Team member null budget fallback by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26809
- fix(proxy): inherit caller identity in passthrough batch managed-object by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26831
- fix(proxy/auth): tighten guardrail modification permission check by @ryan-crabbe-berri in https://github.com/BerriAI/litellm/pull/26821
- [Fix] CI/Tooling: Correct min-release-age value in .npmrc files by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26850
- fix(bedrock): add 1-hour cache write pricing for Claude 4.5/4.6/4.7 (Global, US) by @mateo-berri in https://github.com/BerriAI/litellm/pull/26800
- feat(proxy): add team-level search provider credentials by @Sameerlite in https://github.com/BerriAI/litellm/pull/26691
- Litellm oss staging by @Sameerlite in https://github.com/BerriAI/litellm/pull/26759
- fix(proxy/batches): forward model to retrieve_batch for bedrock by @sruthi-sixt-26 in https://github.com/BerriAI/litellm/pull/26814
- fix(passthrough): track spend for interrupted Bedrock streams by @mateo-berri in https://github.com/BerriAI/litellm/pull/26719
- merge main by @Sameerlite in https://github.com/BerriAI/litellm/pull/26855
- Litellm oss staging by @Sameerlite in https://github.com/BerriAI/litellm/pull/26852
- chore(team): audit-log team-callback admin mutations by @stuxf in https://github.com/BerriAI/litellm/pull/26859
- chore(team): close authz bypass via the available-team check by @stuxf in https://github.com/BerriAI/litellm/pull/26854
- chore(auth): harden invite-link onboarding token flow by @stuxf in https://github.com/BerriAI/litellm/pull/26843
- chore(mcp): tighten OAuth root endpoint resolution by @stuxf in https://github.com/BerriAI/litellm/pull/26840
- fix: validate aws region name by @yassin-berriai in https://github.com/BerriAI/litellm/pull/26906
- fix: drop milvus dbName and partitionNames from MILVUS_OPTIONAL_PARAMS by @yassin-berriai in https://github.com/BerriAI/litellm/pull/26910
- [Feat / Fix] Lazy loaded imports, lazy loaded front page by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26802
- chore(proxy): harden request control fields by @stuxf in https://github.com/BerriAI/litellm/pull/26862
- chore(proxy): block env callback refs in key metadata by @stuxf in https://github.com/BerriAI/litellm/pull/26851
- chore(mcp): encrypt user-scoped MCP credentials at rest by @stuxf in https://github.com/BerriAI/litellm/pull/26836
- chore(mcp): SSRF guard on OAuth metadata discovery follow-up fetches by @stuxf in https://github.com/BerriAI/litellm/pull/26849
- [Fix] Replace subprocess startup-import diff with static source scan by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26934
- chore(passthrough): default auth=True and drop enterprise gate on the safe option by @stuxf in https://github.com/BerriAI/litellm/pull/26827
- chore(proxy): contain UI_LOGO_PATH / LITELLM_FAVICON_URL on unauthenticated asset endpoints by @stuxf in https://github.com/BerriAI/litellm/pull/26815
- chore(cli): tighten CLI SSO session flow by @stuxf in https://github.com/BerriAI/litellm/pull/26835
- [Test] Proxy E2E: Opt In To Client Mock Response For Model Access Tests by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26941
- [Fix] Responses API: Omit Empty Body On DELETE by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26949
- Run pre_call_hook on Google generateContent endpoints by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26914
- [Fix] Refresh Redis TTL on counter writes, skip stale in-memory in Redis by @Michael-RZ-Berri in https://github.com/BerriAI/litellm/pull/26829
- Add pagination controls to model health status by @shivamrawat1 in https://github.com/BerriAI/litellm/pull/26826
- feat(vertex_ai): propagate metadata labels to embedding, Imagen, rerank by @Sameerlite in https://github.com/BerriAI/litellm/pull/25499
- fix(anthropic): json response_format + user tools non-streaming by @Sameerlite in https://github.com/BerriAI/litellm/pull/26222
- [Infra] Bump Versions by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26961
- [Infra] Promote Internal Staging to main by @yuneng-berri in https://github.com/BerriAI/litellm/pull/26962
- [Infra] Promote internal staging to main by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27245
- fix(proxy): point /metrics 401 at the opt-out flag by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27505
- fix(proxy): bound budget reservation per request (backport of [#27509] to 1.84.0rc2) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27539
- build(packaging): relax core runtime pins to ranges (rc2 backport of [#27241]) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27545
- build(packaging): raise jinja2 floor to 3.1.6 (rc2 backport of [#27552]) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27554
- cherry-pick: OpenAPI MCP extra_headers (#27383) onto litellm_1.84.0rc2 by @milan-berri in https://github.com/BerriAI/litellm/pull/27768
- fix(proxy): always merge caller-supplied tags into request metadata by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27789
- cherry-pick: reject bare str at file-input sinks (#27762) onto litellm_1.84.0rc2 by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27794
- [Fix] Lazy feature loading under SERVER_ROOT_PATH returns 404 (backport of [#27812]) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27818
- [Fix] Backport /key/regenerate ownership-rebind + premium-gate guards (#27793) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27819
- fix(proxy): expose db status on public /health/readiness (backport [#27866]) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27868
- fix(ui): fetch version + debug flag from /health/readiness/details (backport [#27896]) by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27899
- [Infra] Build UI by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27901
- chore(proxy): backport [#27898] + [#27801] to 1.84.0rc2 by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27902
- fix: backport [#27892] to litellm_1.84.0rc2 by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27903
- fix: backport [#27878] to litellm_1.84.0rc2 by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27904
- chore: backport [#27908] to litellm_1.84.0rc2 by @yuneng-berri in https://github.com/BerriAI/litellm/pull/27909
New Contributors
- @omriShukrun08 made their first contribution in https://github.com/BerriAI/litellm/pull/26605
- @xinrui-z made their first contribution in https://github.com/BerriAI/litellm/pull/24294
- @minznerjosh made their first contribution in https://github.com/BerriAI/litellm/pull/26710
- @yassinkortam made their first contribution in https://github.com/BerriAI/litellm/pull/26730
- @sruthi-sixt-26 made their first contribution in https://github.com/BerriAI/litellm/pull/26814
Full Changelog: https://github.com/BerriAI/litellm/compare/v1.83.14-stable.patch.3...v1.84.0