Name | Modified | Size | Downloads / Week |
---|---|---|---|
README.md | < 7 hours ago | 5.1 kB | |
v1.78.0-nightly source code.tar.gz | < 7 hours ago | 199.5 MB | |
v1.78.0-nightly source code.zip | < 7 hours ago | 202.0 MB | |
Totals: 3 Items | | 401.5 MB | 0 |
## What's Changed
- Add new Azure AI models with pricing details by @emerzon in https://github.com/BerriAI/litellm/pull/15387
- [Feat] Add EnkryptAI Guardrails on LiteLLM by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15390
- [Fix] Fix dynamic Rate limiter v3 - inserting litellm_model_saturation by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15394
- [Fix] - Sessions not being shared by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15388
- fix: Prevents DB from accidentally overriding config file values if they are empty in DB by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15340
- Temporarily relax ResponsesAPIResponse parsing to support custom backends (e.g., vLLM) by @ashengstd in https://github.com/BerriAI/litellm/pull/15362
- Fix - OpenRouter cache_control to only apply to last content block by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15395 (see the sketch after this list)
- [Fix]: remove panic from hot path by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15396
- [Feat] Support for Vertex AI Gemma Models on Custom Endpoints by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15397
- [Perf]: optimize SSL/TLS handshake performance with prioritized cipher by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15398
- fix: Ensure MCP client stays open during tool call by @uc4w6c in https://github.com/BerriAI/litellm/pull/15391
- Teams page: new column "Your Role" on the teams table by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15384
- Add new together models by @zainhas in https://github.com/BerriAI/litellm/pull/15383
- feat: posthog per request api key by @carlos-marchal-ph in https://github.com/BerriAI/litellm/pull/15379
- Implement Shared Health Check State Across Pods by @Sameerlite in https://github.com/BerriAI/litellm/pull/15380
- Remove hardcoded "public" schema in 20250918083359_drop_spec_version_column_from_mcp_table/migration.sql by @vkolehmainen in https://github.com/BerriAI/litellm/pull/15363
- Minimal fix: gpt5 models should not go on cooldown when called with temperature!=1 by @jlan-nl in https://github.com/BerriAI/litellm/pull/15330
- Add W&B Inference documentation by @xprilion in https://github.com/BerriAI/litellm/pull/15278
- Add OCI Cohere support with tool calling and streaming capabilities by @Sameerlite in https://github.com/BerriAI/litellm/pull/15365
- Updates guardrail provider logos by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15421
- LiteLLM Dashboard Teams UI refactor by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15418
- Enforces removal of unused imports from UI by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15416
- fix: usage page >> Model Activity >> spend per day graph: y-axis clipping on large spend values by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15389
- [Fix] VertexAI - gemma model family support (custom endpoints) by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15419
- fix lint errors by @Sameerlite in https://github.com/BerriAI/litellm/pull/15406
- feat: add Bedrock AU Cross-Region Inference for Claude Sonnet 4.5 by @BCook98 in https://github.com/BerriAI/litellm/pull/15402
- [Feat] VertexAI Gemma model family streaming support + Added MedGemma by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15427 (see the sketch after this list)
- Models & Endpoints Initial Refactor by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15435
- [Feature]: Include server_name in /v1/mcp/server/health endpoint response by @Copilot in https://github.com/BerriAI/litellm/pull/15431
- [Fix] - SensitiveDataMasker converts lists to string by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15420
- Deletion of buggy docker-compose comment that caused `config.yaml`-based startup to fail by @PabloGmz96 in https://github.com/BerriAI/litellm/pull/15425
- Litellm UI API Reference page updates by @ARajan1084 in https://github.com/BerriAI/litellm/pull/15438
- [Feat] Tag Management - Add support for setting tag based budgets by @ishaan-jaff in https://github.com/BerriAI/litellm/pull/15433
- [Fix] - shared session parsing and usage issue by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15440
- [Fix]: handle closed aiohttp sessions by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15442
- [Fix]: prevent session leaks when recreating aiohttp sessions by @AlexsanderHamir in https://github.com/BerriAI/litellm/pull/15443
- fix mapped tests 1 by @Sameerlite in https://github.com/BerriAI/litellm/pull/15445
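
The OpenRouter cache_control fix (#15395) narrows prompt caching so the marker is attached only to the last content block of a message. Below is a minimal sketch of what an affected request might look like; the `openrouter/anthropic/claude-3.5-sonnet` model name and the exact message layout are illustrative assumptions, not taken from the PR.

```python
# Hedged sketch: prompt caching through OpenRouter after #15395.
# Model name and message layout are illustrative assumptions.
import litellm

response = litellm.completion(
    model="openrouter/anthropic/claude-3.5-sonnet",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "Here is a long reference document ..."},
                {
                    "type": "text",
                    "text": "Summarize the document above.",
                    # Per #15395, cache_control should end up on this last
                    # content block only, rather than on every block.
                    "cache_control": {"type": "ephemeral"},
                },
            ],
        }
    ],
)
print(response.choices[0].message.content)
```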
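The Vertex AI Gemma items (#15397, #15419, #15427) add the Gemma model family on custom Vertex AI endpoints, including streaming. A rough sketch of a streaming call, assuming a `vertex_ai/` model prefix and a placeholder endpoint URL (both illustrative, not verified against the PRs):

```python
# Hedged sketch: streaming from a Gemma model on a custom Vertex AI endpoint.
# The model string and api_base below are placeholders, not confirmed values.
import litellm

stream = litellm.completion(
    model="vertex_ai/gemma-3-27b-it",  # assumed model identifier
    messages=[{"role": "user", "content": "Hello from a custom endpoint"}],
    api_base="https://REGION-aiplatform.googleapis.com/v1/projects/PROJECT/locations/REGION/endpoints/ENDPOINT_ID",
    stream=True,  # streaming support for the Gemma family per #15427
)
for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="")
```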
## New Contributors
- @ashengstd made their first contribution in https://github.com/BerriAI/litellm/pull/15362
- @vkolehmainen made their first contribution in https://github.com/BerriAI/litellm/pull/15363
- @jlan-nl made their first contribution in https://github.com/BerriAI/litellm/pull/15330
- @BCook98 made their first contribution in https://github.com/BerriAI/litellm/pull/15402
- @PabloGmz96 made their first contribution in https://github.com/BerriAI/litellm/pull/15425
Full Changelog: https://github.com/BerriAI/litellm/compare/v1.77.7.dev10...v1.78.0-nightly