Name | Modified | Size
---|---|---
README.md | 2023-10-12 | 716 Bytes
Release v1.1.1 source code.tar.gz | 2023-10-12 | 13.4 MB
Release v1.1.1 source code.zip | 2023-10-12 | 13.5 MB

Totals: 3 items, 26.8 MB
This version releases the training scripts, data, and benchmark evaluation scripts for LLaVA-1.5. Bake your LLaVA today!
LLaVA-1.5 achieves SoTA on 11 benchmarks with only simple modifications to the original LLaVA: it uses only publicly available data, completes training in ~1 day on a single 8×A100 node, and surpasses methods like Qwen-VL-Chat that rely on billion-scale data. Check out the technical report and explore the demo! Models are available in the Model Zoo!