
Update docs/multi_gpu.md

Co-authored-by: Suraj Subramanian <5676233+subramen@users.noreply.github.com>
Matthias Reso, 9 months ago
Commit b0b4e16aec
1 changed file with 1 addition and 0 deletions:
  docs/multi_gpu.md

docs/multi_gpu.md (+1, −0)

@@ -83,6 +83,7 @@ sbatch recipes/quickstart/finetuning/multi_node.slurm
 # Change the number of nodes and GPUs per node in the script before running.
 
 ```
+### Fine-tuning using FSDP on 405B Model
 
To fine-tune the Meta Llama 405B model with LoRA on 32x H100 80 GB GPUs, we need to combine 4-bit quantization (QLoRA) with FSDP.
We can achieve this by adding the following environment variables to the slurm script (before the srun command at the bottom).
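The diff is truncated before the variables themselves; as a sketch, a slurm prologue that combines QLoRA with FSDP via Hugging Face Accelerate might export settings like these (the exact variable set added by this commit is not shown here, so treat it as an assumption):

```shell
# Hypothetical example -- the variables actually added are cut off in this diff.
# Tell Hugging Face Accelerate to use FSDP for sharding.
export ACCELERATE_USE_FSDP=1
# Load the checkpoint RAM-efficiently (one rank loads weights, the others
# receive their shards), which matters for a model as large as 405B.
export FSDP_CPU_RAM_EFFICIENT_LOADING=1
```

These exports would go in the slurm script just before the srun command, so every launched rank inherits them.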