| Name | Modified | Size |
|---|---|---|
| 2.16.1 source code.tar.gz | 2026-02-12 | 39.4 MB |
| 2.16.1 source code.zip | 2026-02-12 | 39.4 MB |
| README.md | 2026-02-12 | 377 Bytes |
## What's Changed
- Support efficient flash attention for packed sequences using flash-attn 2.0 (see the sketch below) by @muthissar in https://github.com/lucidrains/x-transformers/pull/350
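Packed (concatenated) sequences avoid materializing padding tokens: several sequences share one token dimension, delimited by cumulative sequence lengths, and each token attends only within its own sequence. A minimal sketch of the underlying flash-attn 2.x varlen interface that this kind of support builds on, independent of how x-transformers wires it up internally; the sequence lengths and head sizes here are illustrative, and `flash_attn_varlen_func` is the public flash-attn API, not an x-transformers function:

```python
import torch
from flash_attn import flash_attn_varlen_func  # requires flash-attn >= 2.0 and a CUDA GPU

# Three sequences of lengths 3, 5, 2 packed into one tensor of 10 tokens.
seq_lens = torch.tensor([3, 5, 2], dtype=torch.int32, device="cuda")
# Cumulative boundaries [0, 3, 8, 10]; flash-attn expects int32 offsets.
cu_seqlens = torch.nn.functional.pad(seq_lens.cumsum(0, dtype=torch.int32), (1, 0))
total, heads, dim_head = int(seq_lens.sum()), 8, 64

# Packed layout: (total_tokens, heads, dim_head); flash-attn needs fp16/bf16.
q = torch.randn(total, heads, dim_head, device="cuda", dtype=torch.float16)
k, v = torch.randn_like(q), torch.randn_like(q)

# Attention is computed per sequence, as delimited by cu_seqlens, so tokens
# never attend across sequence boundaries and no padding is ever processed.
out = flash_attn_varlen_func(
    q, k, v,
    cu_seqlens_q=cu_seqlens, cu_seqlens_k=cu_seqlens,
    max_seqlen_q=int(seq_lens.max()), max_seqlen_k=int(seq_lens.max()),
    causal=True,
)
print(out.shape)  # torch.Size([10, 8, 64])
```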
## New Contributors
- @muthissar made their first contribution in https://github.com/lucidrains/x-transformers/pull/350
**Full Changelog**: https://github.com/lucidrains/x-transformers/compare/2.16.0...2.16.1