| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| README.md | 2025-11-12 | 3.3 kB | |
| v1.79.3.dev5 source code.tar.gz | 2025-11-12 | 217.6 MB | |
| v1.79.3.dev5 source code.zip | 2025-11-12 | 220.4 MB | |
| Totals: 3 items | | 438.0 MB | 0 |
## What's Changed
- [Infra] CI/CD - Bump up docker version for e2e ui testing by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16506
- Add Zscaler AI Guard hook by @jwang-gif in https://github.com/BerriAI/litellm/pull/15691
- [Feature] UI - Add Tags To Edit Key Flow by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16500
- fix: remove enterprise restriction from guardrails list endpoint by @pazevedo-hyland in https://github.com/BerriAI/litellm/pull/15333
- [Feat] New Provider - Add RunwayML Provider for video generations by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/16505
- [Feature] UI - Improve Usage Indicator by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16504
- [Feature] UI - Add LiteLLM Params to Edit Model by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16496
- [Fix] Litellm tags usage add request_id by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16111
- [Fix] Use user budget instead of key budget when creating new team by @yuneng-jiang in https://github.com/BerriAI/litellm/pull/16074
- Documentation Code Example corrections by @AnthonyMonaco in https://github.com/BerriAI/litellm/pull/16502
- fix: Sanitize null token usage in OpenAI-compatible responses by @AlanPonnachan in https://github.com/BerriAI/litellm/pull/16493
- fix: add new models, delete repeat models, update pricing. by @mcowger in https://github.com/BerriAI/litellm/pull/16491
- Add atexit handlers to flush callbacks for async completions by @andrewm4894 in https://github.com/BerriAI/litellm/pull/16487
- Update model logging format for custom LLM provider by @f14-bertolotti in https://github.com/BerriAI/litellm/pull/16485
- Vertex ai Rerank safe load vertex ai creds by @Sameerlite in https://github.com/BerriAI/litellm/pull/16479
- fix(agentcore): Convert SSE stream iterator to async for proper streaming support by @busla in https://github.com/BerriAI/litellm/pull/16293
- fix: use vllm passthrough config for hosted vllm provider instead of raising error by @MightyGoldenOctopus in https://github.com/BerriAI/litellm/pull/16537
- fix: app_roles missing from jwt payload by @mubashir1osmani in https://github.com/BerriAI/litellm/pull/16448
- [Fix] - Bedrock Knowledge Bases - add support for filtering kb queries by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/16543
- [Fix] Bedrock Embeddings - Ensure correct `aws_region` is used when provided dynamically by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/16547
- docs: update broken Slack invite links to support page by @Chesars in https://github.com/BerriAI/litellm/pull/16546
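As a minimal sketch of the kind of fix described in https://github.com/BerriAI/litellm/pull/16493 (sanitizing null token usage in OpenAI-compatible responses), the idea is to coerce `None` token counts to zero before downstream cost or usage accounting runs. The function below is illustrative only, not litellm's actual implementation:

```python
# Illustrative sketch (not litellm's actual code): some OpenAI-compatible
# servers return null for token counts; downstream accounting expects ints.
def sanitize_usage(usage: dict) -> dict:
    """Return a copy of an OpenAI-style usage block with None counts set to 0."""
    fields = ("prompt_tokens", "completion_tokens", "total_tokens")
    return {**usage, **{f: usage.get(f) or 0 for f in fields}}

print(sanitize_usage(
    {"prompt_tokens": None, "completion_tokens": 5, "total_tokens": None}
))
# → {'prompt_tokens': 0, 'completion_tokens': 5, 'total_tokens': 0}
```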
## New Contributors
- @jwang-gif made their first contribution in https://github.com/BerriAI/litellm/pull/15691
- @AnthonyMonaco made their first contribution in https://github.com/BerriAI/litellm/pull/16502
- @andrewm4894 made their first contribution in https://github.com/BerriAI/litellm/pull/16487
- @f14-bertolotti made their first contribution in https://github.com/BerriAI/litellm/pull/16485
- @busla made their first contribution in https://github.com/BerriAI/litellm/pull/16293
- @MightyGoldenOctopus made their first contribution in https://github.com/BerriAI/litellm/pull/16537
**Full Changelog**: https://github.com/BerriAI/litellm/compare/v1.79.dev.1...v1.79.3.dev5