Commit history

Author SHA1 Message Date
  Matthias Reso c167945448 remove 405B ft doc 7 months ago
  Matthias Reso b0b4e16aec Update docs/multi_gpu.md 7 months ago
  Matthias Reso b319a9fb8c Fix lint issue 7 months ago
  Matthias Reso afb3b75892 Add 405B + QLoRA + FSDP to multi_gpu.md doc 7 months ago
  Hamid Shojanazeri 808a3f7a0c Adding support for FSDP+Qlora. (#572) 7 months ago
  Kai Wu f6617fb86a changed --pure_bf16 to --fsdp_config.pure_bf16 and corrected "examples/" path (#587) 7 months ago
  Pia Papanna 4344a420f2 recipes/quickstart folder updated 8 months ago
  Kai Wu f1d90d0ff0 fix wandb config update 9 months ago
  Kai Wu 98c0284a7c updated readme for new finetune config 9 months ago
  Kerim 9da9f01cdc fixes 9 months ago
  Kai Wu 7a08c27879 Merge branch 'main' into fix/finetune_readme 10 months ago
  Kai Wu 13f2734e25 updated finetuning readme to Meta Llama 3 10 months ago
  Kai Wu 26e877fd42 changed readme, unified the context interface and added get_flops_per_sec() 10 months ago
  Kai Wu d9558c11ca changed context name and add more docs 10 months ago
  Suraj Subramanian 6d449a859b New folder structure (#1) 11 months ago
  Matthias Reso 5446ea7999 Purge last remaining llama_finetuning.py doc refs 1 year ago
  Matthias Reso 72a9832571 Merge branch 'main' into feature/package_distribution 1 year ago
  Matthias Reso 360a658262 Adjusted docs to reflect move of qs nb + finetuning script into examples 1 year ago
  Matthias Reso 34e45490ba Exchange micro_batching_size against gradient_accumulation_steps in docs 1 year ago
  Matthias Reso 6c38cbeb6e Update dataset folder 1 year ago
  Matthias Reso 789846a548 Update docs/multi_gpu.md docs/single_gpu.md with package context 1 year ago
  lchu feaa344af3 resolve conflicts 1 year ago
  Hamid Shojanazeri 8fddaa9966 resolving conflicts 1 year ago
  Hamid Shojanazeri 75f291fe1c resolved conflicts 1 year ago
  Chengyu Ma e741dde675 fix some typos 1 year ago
  Philipp Parzer e6329a1fac Fix filename typo 1 year ago