
## Finetune of MiniGPT-4

You first need to prepare the dataset. You can follow the steps in our dataset preparation guide.

In `train_configs/minigptv2_finetune.yaml`, you need to set up the following paths:

- `llama_model` checkpoint path: "/path/to/llama_checkpoint"
- `ckpt` (the pretrained checkpoint to finetune from): "/path/to/pretrained_checkpoint"
- checkpoint save path: "/path/to/save_checkpoint"
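As a rough sketch, the relevant fields in the yaml might look like the following (the key names and nesting here are assumptions; check them against your copy of `train_configs/minigptv2_finetune.yaml`):

```yaml
# Sketch only -- verify key names against the actual config file.
model:
  llama_model: "/path/to/llama_checkpoint"
  ckpt: "/path/to/pretrained_checkpoint"
run:
  output_dir: "/path/to/save_checkpoint"
```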

For `ckpt`, you may load one of our pretrained model checkpoints:

| MiniGPT-v2 (after stage-2) | MiniGPT-v2 (after stage-3) | MiniGPT-v2 (online developing demo) |
| :---: | :---: | :---: |
| Download | Download | Download |
Then launch the finetuning, replacing `NUM_GPU` with the number of GPUs on your machine:

```
torchrun --nproc-per-node NUM_GPU train.py --cfg-path train_configs/minigptv2_finetune.yaml
```
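Before launching a long training run, it can help to sanity-check that the checkpoint paths you set in the config actually exist on disk. A minimal sketch (the paths and the `missing_paths` helper are illustrative placeholders, not part of the MiniGPT-4 codebase):

```python
from pathlib import Path

# Placeholder paths -- substitute the values from your finetune yaml.
cfg_paths = {
    "llama_model": "/path/to/llama_checkpoint",
    "ckpt": "/path/to/pretrained_checkpoint",
}

def missing_paths(paths):
    """Return the config keys whose paths do not exist on disk."""
    return [key for key, p in paths.items() if not Path(p).exists()]

# Any keys printed here point at paths that need fixing before training.
print(missing_paths(cfg_paths))
```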