From d1566968339d6149954a68f37e0be906a8d7376f Mon Sep 17 00:00:00 2001
From: Sypherd
Date: Tue, 19 Sep 2023 08:05:26 -0600
Subject: [PATCH] Specify absolute path in READMEs

---
 README.md                 | 8 ++++----
 dataset/README_1_STAGE.md | 4 ++--
 dataset/README_2_STAGE.md | 2 +-
 3 files changed, 7 insertions(+), 7 deletions(-)

diff --git a/README.md b/README.md
index 02bc504..f09367c 100644
--- a/README.md
+++ b/README.md
@@ -64,9 +64,9 @@ Download the corresponding LLM weights from the following huggingface space via
 [Downlad](https://huggingface.co/Vision-CAIR/vicuna/tree/main) | [Download](https://huggingface.co/Vision-CAIR/vicuna-7b/tree/main) | [Download](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf/tree/main)
 
-Then, set the path to the vicuna weight in the model config file
+Then, set the absolute path to the vicuna weight in the model config file
 [here](minigpt4/configs/models/minigpt4_vicuna0.yaml#L18) at Line 18
-and/or the path to the llama2 weight in the model config file
+and/or the absolute path to the llama2 weight in the model config file
 [here](minigpt4/configs/models/minigpt4_llama2.yaml#L15) at Line 15.
 
 **3. Prepare the pretrained MiniGPT-4 checkpoint**
@@ -78,7 +78,7 @@ Download the pretrained checkpoints according to the Vicuna model you prepare.
 [Downlad](https://drive.google.com/file/d/1a4zLvaiDBr-36pasffmgpvH5P7CKmpze/view?usp=share_link) | [Download](https://drive.google.com/file/d/1RY9jV0dyqLX-o38LrumkKRh6Jtaop58R/view?usp=sharing) | [Download](https://drive.google.com/file/d/11nAPjEok8eAGGEG1N2vXo3kBLCg0WgUk/view?usp=sharing)
 
-Then, set the path to the pretrained checkpoint in the evaluation config file
+Then, set the absolute path to the pretrained checkpoint in the evaluation config file
 in [eval_configs/minigpt4_eval.yaml](eval_configs/minigpt4_eval.yaml#L10) at Line 8 for Vicuna version or [eval_configs/minigpt4_llama2_eval.yaml](eval_configs/minigpt4_llama2_eval.yaml#L10) for LLama2 version.
@@ -137,7 +137,7 @@ and convert it to a conversation format to further align MiniGPT-4.
 To download and prepare our second stage dataset, please check our
 [second stage dataset preparation instruction](dataset/README_2_STAGE.md).
 To launch the second stage alignment,
-first specify the path to the checkpoint file trained in stage 1 in
+first specify the absolute path to the checkpoint file trained in stage 1 in
 [train_configs/minigpt4_stage1_pretrain.yaml](train_configs/minigpt4_stage2_finetune.yaml).
 You can also specify the output path there.
 Then, run the following command. In our experiments, we use 1 A100.
diff --git a/dataset/README_1_STAGE.md b/dataset/README_1_STAGE.md
index 5c92b92..dfe4516 100644
--- a/dataset/README_1_STAGE.md
+++ b/dataset/README_1_STAGE.md
@@ -84,11 +84,11 @@ The final dataset structure
 
 ## Set up the dataset configuration files
 
-Then, set up the LAION dataset loading path in
+Then, set up the absolute LAION dataset loading path in
 [here](../minigpt4/configs/datasets/laion/defaults.yaml#L5) at Line 5 as
 ${MINIGPT4_DATASET}/laion/laion_dataset/{00000..10488}.tar
 
-and the Conceptual Captoin and SBU datasets loading path in
+and the absolute Conceptual Captions and SBU datasets loading path in
 [here](../minigpt4/configs/datasets/cc_sbu/defaults.yaml#L5) at Line 5 as
 ${MINIGPT4_DATASET}/cc_sbu/cc_sbu_dataset/{00000..01255}.tar
diff --git a/dataset/README_2_STAGE.md b/dataset/README_2_STAGE.md
index b826765..e2e3893 100644
--- a/dataset/README_2_STAGE.md
+++ b/dataset/README_2_STAGE.md
@@ -14,6 +14,6 @@ cc_sbu_align
 ```
 Put the folder to any path you want.
-Then, set up the dataset path in the dataset config file
+Then, set up the absolute dataset path in the dataset config file
 [here](../minigpt4/configs/datasets/cc_sbu/align.yaml#L5) at Line 5.
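Every hunk in this patch tells the reader to put an absolute path into a YAML config file. As a minimal sketch of what that looks like in practice — the key names (`llama_model`, `ckpt`) and the paths are assumptions for illustration, not taken from the patch itself — the edited config lines might read:

```yaml
# Hypothetical fragments of the config files the patched READMEs point at.
# Key names and paths are illustrative assumptions, not part of the patch.

# minigpt4/configs/models/minigpt4_vicuna0.yaml, Line 18:
llama_model: "/home/user/weights/vicuna-7b"   # absolute, not a relative path like "vicuna-7b"

# eval_configs/minigpt4_eval.yaml, Line 8:
ckpt: "/home/user/checkpoints/minigpt4_checkpoint.pth"
```

A relative path would resolve against the process's current working directory, which may not be the repository root when training or evaluation is launched from elsewhere — presumably the motivation for insisting on absolute paths throughout.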