LLaVA v1.1.3
Files

  • README.md (812 Bytes, 2023-10-26)
  • Release v1.1.3 (Bring your own data, LoRA training) source code.tar.gz (13.2 MB, 2023-10-26)
  • Release v1.1.3 (Bring your own data, LoRA training) source code.zip (13.3 MB, 2023-10-26)
  • Totals: 3 items, 26.5 MB

Updates

  • Support LoRA for the instruction tuning stage of LLaVA-1.5: comparable performance to full-model finetuning, with lower GPU VRAM requirements. (ckpts/logs, script)
  • Bring your own data and finetune LLaVA-1.5 for your own task; a sketch of the expected data format follows this list. (instruction)
  • Basic support for Windows. (instruction)
  • Fix: training with gradient accumulation now behaves the same as large-batch training.
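
For the bring-your-own-data item above, the finetuning path consumes the training set as a JSON list of conversation records. The sketch below assumes the id / image / conversations record layout described in LLaVA's custom-finetuning instructions; the file paths, prompt, and answer text are placeholders, so treat it as a minimal illustration rather than a verified schema.

```python
# Minimal sketch of a "bring your own data" training file for LLaVA-1.5.
# Field names (id / image / conversations, "from": human|gpt, "<image>" token)
# are assumed from LLaVA's custom-dataset instructions; paths and text below
# are placeholders for your own task.
import json

records = [
    {
        "id": "0001",
        "image": "my_task/0001.jpg",  # relative to the image folder passed to the training script
        "conversations": [
            {"from": "human", "value": "<image>\nDescribe the defect shown in this image."},
            {"from": "gpt", "value": "The board has a cracked solder joint near the top-left connector."},
        ],
    },
]

with open("my_task_train.json", "w") as f:
    json.dump(records, f, indent=2)
```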

Notes

  • A new LoRA schedule is used for LLaVA-1.5 (see the sketch below):
      • rank: 128
      • alpha: 256
      • lr (LoRA): 2e-4
      • lr (projector): 2e-5
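
As a rough illustration of how this schedule could map onto training code, here is a sketch using the HuggingFace peft library and a two-group optimizer. The use of peft's LoraConfig and the "mm_projector" parameter name are assumptions for illustration; LLaVA's own training scripts wire these hyperparameters up internally.

```python
# Sketch of the v1.1.3 LoRA schedule: rank 128, alpha 256, lr 2e-4 for LoRA
# weights and 2e-5 for the multimodal projector. Uses HuggingFace peft's
# LoraConfig; "mm_projector" is an assumed module name, not taken from the notes.
import torch
from peft import LoraConfig

lora_config = LoraConfig(
    r=128,           # LoRA rank
    lora_alpha=256,  # LoRA scaling factor
    bias="none",
    task_type="CAUSAL_LM",
)

def build_optimizer(model: torch.nn.Module) -> torch.optim.Optimizer:
    """Split trainable parameters into LoRA weights (lr 2e-4) and projector weights (lr 2e-5)."""
    lora_params, projector_params = [], []
    for name, param in model.named_parameters():
        if not param.requires_grad:
            continue
        if "lora_" in name:
            lora_params.append(param)
        elif "mm_projector" in name:  # assumed projector module name
            projector_params.append(param)
    return torch.optim.AdamW(
        [
            {"params": lora_params, "lr": 2e-4},
            {"params": projector_params, "lr": 2e-5},
        ]
    )
```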
Source: README.md, updated 2023-10-26