Revision History

Author SHA1 Message Date
Huang Zhihong a620831762 Fix the bug when continue the peft. (#717) 6 months ago
Kai Wu 9c7a5b421f fix AutoModel and bump transformers version to 4.45 (#686) 6 months ago
Kai Wu 57afa0b51e use AutoModel 7 months ago
Kai Wu 2730bcaab7 fix readme and fsdp logic 7 months ago
Kai Wu 50dff0b78e gradient_checkpointing_enable() 7 months ago
Kai Wu c18a0d277f changed dataset to ocrvqa 7 months ago
Kai Wu bd22f407d5 changed to aid2 dataset 7 months ago
Kai Wu 1a76080807 lora+fsdp working 7 months ago
Kai Wu 8a11b48022 lora+fsdp not working 7 months ago
Kai Wu 79dbe05a94 batch fine-tuning lmm working 7 months ago
Kai Wu ce299b3439 add get_custom_data_collator feature 7 months ago
Kai Wu 12da109823 Merge branch 'main' into lmm_finetune 7 months ago
Kai Wu bb990be967 not working, need create dataloader function 7 months ago
Matthias Reso 778e31e35c Fix checkpoint saving (#650) 7 months ago
Kai Wu ee204ccb98 working now 7 months ago
Kai Wu b566582a86 finetune not working with fsdp 7 months ago
Hamid Shojanazeri 808a3f7a0c Adding support for FSDP+Qlora. (#572) 9 months ago
haozhx23 e6b0f97199 Fix hsdp_device_mesh=None when enable HSDP and HYBRID_SHARD (#402) 10 months ago
Kai Wu 41a46d811d fix alpaca dataset by using 5% of the data as eval and make sure len((eval_loader)>0 11 months ago
Kai Wu f1d90d0ff0 fix wandb config update 11 months ago
Kai Wu 480c4f2b5e resume the finetuning given the path of the previous peft checkpoint folder 11 months ago
Merovingian b4e1a420c8 Freeze layer bug fix 11 months ago
Pavel Belevich fb2e802cef Fix param_init_fn: move if-statement out of lambda 11 months ago
Beto 65559c1640 Merge branch 'main' of github.com:albertodepaola/llama-recipes-private into main 1 year ago
Matthias Reso 43cb6a2db4 Remove check for nighlies for low_cpu_fsdp and bump torch version to 2.2 instead 1 year ago
Matthias Reso 113ea18bf1 Replace LlamaTokenizer with AutoTokenizer 1 year ago
Rahul A R f8183b96fe use new tokenizer_name argument and resize embeddings if required 1 year ago
Hamid Shojanazeri 11f51db28c adding the kbit prep in the code 1 year ago
Hamid Shojanazeri f058ff6ccd update due to peft new release 1 year ago
Hamid Shojanazeri ffdc93f00a Merge branch 'main' into wandb_logging 1 year ago