This version is built for Hugging Face Transformers v4.57.x.
## New

- Add support for DoRA adapters (@julian-fong via https://github.com/adapter-hub/adapters/pull/790): DoRA (Liu et al., 2024) is a LoRA variant that decomposes each pretrained weight matrix into a magnitude and a direction component and applies the low-rank update to the direction. See https://docs.adapterhub.ml/methods.html#dora, and the usage sketch below.
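  Since DoRA builds on LoRA, it is configured through the LoRA setup. Below is a minimal sketch of enabling it on a Hugging Face model; the `use_dora` flag on `LoRAConfig` is an assumption here, so check the linked method docs for the exact option name in your version:

  ```python
  # Minimal sketch: training a DoRA adapter with the adapters library.
  # The `use_dora` flag on LoRAConfig is an assumption; consult
  # https://docs.adapterhub.ml/methods.html#dora for the exact option name.
  from transformers import AutoModelForSequenceClassification
  import adapters
  from adapters import LoRAConfig

  model = AutoModelForSequenceClassification.from_pretrained("roberta-base")
  adapters.init(model)  # retrofit the plain Transformers model with adapter support

  config = LoRAConfig(r=8, alpha=16, use_dora=True)  # assumed flag; DoRA rides on the LoRA config
  model.add_adapter("dora_adapter", config=config)
  model.train_adapter("dora_adapter")  # freeze base weights, train only the adapter
  ```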
## More
- Add EAT model interface for LoRA adapter support (@mgustineli via https://github.com/adapter-hub/adapters/pull/832)
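  Custom architectures like EAT are attached through the library's adapter model interface, which maps adapter methods onto named submodules. The sketch below shows what such a registration looks like, assuming the `AdapterModelInterface` API; the module paths and the `load_custom_eat_model()` helper are illustrative placeholders, not EAT's actual attribute names:

  ```python
  # Minimal sketch: registering LoRA support for a custom model via an
  # adapter model interface. All module paths ("embeddings", "encoder.layers",
  # "q_proj", ...) are illustrative assumptions, not EAT's real attribute names.
  import adapters
  from adapters import AdapterModelInterface, LoRAConfig

  interface = AdapterModelInterface(
      adapter_methods=["lora"],       # which adapter methods to enable
      model_embeddings="embeddings",  # path to the embedding module
      model_layers="encoder.layers",  # path to the stack of transformer layers
      layer_self_attn="attention",    # self-attention module inside each layer
      layer_cross_attn=None,          # encoder-only model: no cross-attention
      attn_q_proj="q_proj",           # attention projection submodules
      attn_k_proj="k_proj",
      attn_v_proj="v_proj",
      attn_o_proj="o_proj",
      layer_intermediate_proj="fc1",  # feed-forward up-projection
      layer_output_proj="fc2",        # feed-forward down-projection
  )

  model = load_custom_eat_model()  # placeholder: however you construct the custom model
  adapters.init(model, interface=interface)
  model.add_adapter("eat_lora", config=LoRAConfig(r=8, alpha=16))
  ```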
## Changed
- Upgrade supported Transformers version to v4.57.x (@lenglaender via https://github.com/adapter-hub/adapters/pull/814, @calpt via https://github.com/adapter-hub/adapters/pull/829)
## Fixed
- Fix Prefix Tuning for grouped-query attention (GQA) (@ha405 via https://github.com/adapter-hub/adapters/pull/825)
- Handle NotImplementedError from get_input_embeddings() during init (@mgustineli via https://github.com/adapter-hub/adapters/pull/831)
- Fix LoRAMergedLinear.get_n_heads() shape mismatch with fused QKV + "o" matrix (@mgustineli via https://github.com/adapter-hub/adapters/pull/830)
- Fix path traversal issue when downloading adapter archives (@calpt via https://github.com/adapter-hub/adapters/pull/833)