Name | Modified | Size | Downloads / Week |
---|---|---|---|
pytorch_lightning-2.5.2.tar.gz | 2025-06-20 | 636.9 kB | |
lightning-2.5.2.tar.gz | 2025-06-20 | 633.4 kB | |
pytorch_lightning-2.5.2-py3-none-any.whl | 2025-06-20 | 825.4 kB | |
lightning-2.5.2-py3-none-any.whl | 2025-06-20 | 821.1 kB | |
lightning_fabric-2.5.2.tar.gz | 2025-06-20 | 195.9 kB | |
lightning_fabric-2.5.2-py3-none-any.whl | 2025-06-20 | 250.8 kB | |
Lightning v2.5.2 source code.tar.gz | 2025-06-19 | 16.4 MB | |
Lightning v2.5.2 source code.zip | 2025-06-19 | 17.1 MB | |
README.md | 2025-06-19 | 3.2 kB | |
Totals: 9 items | | 36.8 MB | 0 |
Notable changes in this release
PyTorch Lightning
Changed
- Add `enable_autolog_hparams` argument to `Trainer` ([#20593](https://github.com/Lightning-AI/pytorch-lightning/pull/20593))
- Add `toggled_optimizer(optimizer)` method to the `LightningModule`, a context-manager version of `toggle_optimizer` and `untoggle_optimizer`; see the sketch after this list ([#20771](https://github.com/Lightning-AI/pytorch-lightning/pull/20771))
- For cross-device local checkpoints, instruct users to install `fsspec>=2025.5.0` if it is unavailable ([#20780](https://github.com/Lightning-AI/pytorch-lightning/pull/20780))
- Check that the param is of type `nn.Parameter` during pruning sanitization ([#20783](https://github.com/Lightning-AI/pytorch-lightning/pull/20783))
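The two additions above lend themselves to a short example. The following is a minimal sketch, not taken from the official docs: it assumes `toggled_optimizer` wraps the existing `toggle_optimizer`/`untoggle_optimizer` pair as described in #20771, and that `enable_autolog_hparams=False` opts out of the automatic hyperparameter logging introduced in #20593; the model, data shapes, and hyperparameters are illustrative only.

```python
# Hedged sketch of the two "Changed" items above; not an official example.
import torch
import lightning.pytorch as pl


class LitModel(pl.LightningModule):
    def __init__(self, lr: float = 0.1):
        super().__init__()
        self.save_hyperparameters()
        self.automatic_optimization = False  # manual optimization
        self.layer = torch.nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        opt = self.optimizers()
        # One context manager instead of explicit toggle/untoggle calls.
        with self.toggled_optimizer(opt):
            loss = self.layer(batch).sum()
            self.manual_backward(loss)
            opt.step()
            opt.zero_grad()

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=self.hparams.lr)


# Assumed behavior: opt out of the automatic hyperparameter logging at fit start.
trainer = pl.Trainer(max_epochs=1, enable_autolog_hparams=False)
```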
Fixed
- Fixed `save_hyperparameters` not working correctly with `LightningCLI` when parsing links are applied on instantiation ([#20777](https://github.com/Lightning-AI/pytorch-lightning/pull/20777))
- Fixed an edge case in `logger_connector` where the step could be a float ([#20692](https://github.com/Lightning-AI/pytorch-lightning/pull/20692))
- Synchronized SIGTERM handling in DDP to prevent deadlocks ([#20825](https://github.com/Lightning-AI/pytorch-lightning/pull/20825))
- Fixed case-sensitive model name handling ([#20661](https://github.com/Lightning-AI/pytorch-lightning/pull/20661))
- CLI: resolved a `jsonargparse` deprecation warning ([#20802](https://github.com/Lightning-AI/pytorch-lightning/pull/20802))
- Moved `check_inputs` to the target device, if available, during `to_torchscript` ([#20873](https://github.com/Lightning-AI/pytorch-lightning/pull/20873))
- Fixed progress bar display to correctly handle iterable datasets and `max_steps` during training ([#20869](https://github.com/Lightning-AI/pytorch-lightning/pull/20869))
- Fixed a problem with silently supporting `jsonnet` ([#20899](https://github.com/Lightning-AI/pytorch-lightning/pull/20899))
Lightning Fabric
Changed
- Ensure the correct device is used for autocast when `mps` is selected as the Fabric accelerator; see the sketch below ([#20876](https://github.com/Lightning-AI/pytorch-lightning/pull/20876))
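A minimal sketch of the behavior touched by #20876, assuming an Apple Silicon machine where the `mps` accelerator is available; the model and tensor shapes are made up, and the point is only that the `fabric.autocast()` context now targets the selected device.

```python
# Hedged sketch: autocast with the MPS accelerator in Fabric (assumes MPS hardware).
import torch
from lightning.fabric import Fabric

fabric = Fabric(accelerator="mps", devices=1, precision="16-mixed")
fabric.launch()

model = fabric.setup(torch.nn.Linear(32, 32))
x = torch.randn(8, 32, device=fabric.device)

with fabric.autocast():  # autocast context now matches the selected (mps) device
    y = model(x)
```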
Fixed
- Fixed `TransformerEnginePrecision` conversion for layers with `bias=False` ([#20805](https://github.com/Lightning-AI/pytorch-lightning/pull/20805))
Full commit list: 2.5.1 -> 2.5.2
Contributors
We thank all the folks who submitted issues, features, fixes, and doc changes. It's the only way we can collectively make Lightning :zap: better for everyone. Nice job!
In particular, we would like to thank the authors of the pull requests above, in no particular order:
@adamjstewart, @Armannas, @bandpooja, @Borda, @chanokin, @duydl, @GdoongMathew, @KAVYANSHTYAGI, @mauvilsa, @muthissar, @rustamzh, @siemdejong
Thank you :heart: and we hope you'll keep them coming!