
Update docs/multi_gpu.md

Co-authored-by: Suraj Subramanian <5676233+subramen@users.noreply.github.com>
Matthias Reso 9 months ago
parent
commit
b0b4e16aec
1 changed file with 1 addition and 0 deletions
  1. docs/multi_gpu.md (+1 / -0)

docs/multi_gpu.md (+1 / -0)

@@ -83,6 +83,7 @@ sbatch recipes/quickstart/finetuning/multi_node.slurm
 # Change the number of nodes and GPUs per node in the script before running.
 
 ```
+### Fine-tuning using FSDP on 405B Model
 
 To fine-tune the Meta Llama 405B model with LoRA on 32x H100 80 GB GPUs, we need to combine 4-bit quantization (QLoRA) and FSDP.
 We can achieve this by adding the following environment variables to the slurm script (before the srun command at the bottom).
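
The exact variables are not shown in this hunk; as a rough sketch, assuming the Hugging Face Accelerate FSDP integration used by the recipes, the additions placed before the srun command might look like the following (the variable names here are an assumption, not part of this diff):

```
# Sketch (assumption): enable Accelerate's FSDP backend and RAM-efficient
# loading of the quantized checkpoint on each rank; add before the srun command.
export ACCELERATE_USE_FSDP=1
export FSDP_CPU_RAM_EFFICIENT_LOADING=1
```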