| Name | Modified | Size |
|------|----------|------|
| Patch release 4.54.1 source code.tar.gz | 2025-07-29 | 18.9 MB |
| Patch release 4.54.1 source code.zip | 2025-07-29 | 23.9 MB |
| README.md | 2025-07-29 | 881 Bytes |

Totals: 3 items, 42.8 MB

Patch release 4.54.1

Quite a few bugs slipped through, as the release was a bit rushed. Sorry, everyone! 🤗 This patch is mostly cache fixes, since we now have a layered cache, plus fixes to distributed training.

  • Fix Cache.max_cache_len max value for Hybrid models, @manueldeprada, @Cyrilvallez, [#39737]
  • [modernbert] fix regression, @zucchini-nlp, [#39750]
  • Fix version issue in modeling_utils.py, @Cyrilvallez, [#39759]
  • Fix GPT2 with cross attention, @zucchini-nlp, [#39754]
  • Fix mamba regression, @manueldeprada, [#39728]
  • Fix: add back base model plan, @S1ro1, [#39733]
  • fix cache inheritance, [#39748]
  • Fix cache-related tests, @zucchini-nlp, [#39676]
  • Fix Layer device placement in Caches, @Cyrilvallez, [#39732]
  • PATCH: add back n-dim device-mesh + fix tp trainer saving, @S1ro1, @SunMarc, [#39693]
  • fix missing model._tp_size from ep refactor, @winglian, [#39688]
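To pick up these fixes, upgrade with `pip install --upgrade transformers` and confirm you are on at least 4.54.1. The helper below is a minimal sketch (the function name and version tuple are illustrative, not part of the library's API); it uses only the standard library so it works without importing `transformers` itself.

```python
from importlib.metadata import version, PackageNotFoundError


def has_patch(pkg: str = "transformers", required: tuple = (4, 54, 1)) -> bool:
    """Return True if the installed package is at least the given patch release.

    Compares only the numeric major.minor.patch components; pre-release
    suffixes are not handled in this sketch.
    """
    try:
        installed = tuple(int(p) for p in version(pkg).split(".")[:3])
    except (PackageNotFoundError, ValueError):
        # Package missing, or a version string this simple parser can't read.
        return False
    return installed >= required
```

Tuple comparison handles the version ordering, so `(4, 54, 1) >= (4, 54, 1)` passes while `(4, 54, 0)` does not.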
Source: README.md, updated 2025-07-29