| Name | Modified | Size |
|---|---|---|
| 0.17.1 source code.tar.gz | 2025-08-21 | 7.8 MB |
| 0.17.1 source code.zip | 2025-08-21 | 8.3 MB |
| README.md | 2025-08-21 | 687 Bytes |
This patch release contains a few fixes (via [#2710]) for the newly introduced `target_parameters` feature, which allows LoRA to target `nn.Parameter`s directly (useful for mixture-of-experts layers); see the sketch after the list below. Most notably:
- PEFT no longer removes possibly existing parametrizations from the parameter.
- Adding multiple adapters (via `model.add_adapter` or `model.load_adapter`) did not work correctly. Since a solution is not trivial, PEFT now raises an error to prevent this situation.
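
For illustration, here is a minimal sketch of what targeting `nn.Parameter`s with `target_parameters` looks like. The model id and parameter paths are hypothetical; the paths you need depend on the actual model architecture.

```python
# Minimal sketch of the target_parameters feature, assuming a
# hypothetical mixture-of-experts checkpoint.
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

model = AutoModelForCausalLM.from_pretrained("my-org/my-moe-model")  # hypothetical model id

config = LoraConfig(
    # Target nn.Parameter attributes directly (e.g. the stacked expert
    # weights of an MoE layer) instead of nn.Linear modules.
    target_parameters=["experts.gate_up_proj", "experts.down_proj"],
)
model = get_peft_model(model, config)

# As of this release, adding a second adapter on top of a
# target_parameters adapter raises an error instead of silently
# misbehaving:
# model.add_adapter("second", config)  # -> raises
```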