If you are new to developing with Meta Llama models, this is where you should start. This folder contains introductory-level notebooks across different techniques relating to Meta Llama:

- `RAG/`
- `Running_Llama3_Anywhere/`
- `agents/`
- `finetuning/`
- `inference/`
- `Getting_to_know_Llama.ipynb`
- `Prompt_Engineering_with_Llama_3.ipynb`
| Feature | Supported |
| ---------------------------------------------- | --- |
| HF support for finetuning | ✅ |
| Deferred initialization (meta init) | ✅ |
| HF support for inference | ✅ |
| Low CPU mode for multi GPU | ✅ |
| Mixed precision | ✅ |
| Single node quantization | ✅ |
| Flash attention | ✅ |
| PEFT | ✅ |
| Activation checkpointing FSDP | ✅ |
| Hybrid Sharded Data Parallel (HSDP) | ✅ |
| Dataset packing & padding | ✅ |
| BF16 optimizer (pure BF16) | ✅ |
| Profiling & MFU tracking | ✅ |
| Gradient accumulation | ✅ |
| CPU offloading | ✅ |
| FSDP checkpoint conversion to HF for inference | ✅ |
| W&B experiment tracker | ✅ |
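Several of these features can be exercised directly through the Hugging Face stack. Below is a minimal sketch of LoRA finetuning with `transformers` and `peft`, touching the PEFT, pure-BF16, and gradient-accumulation rows of the table. The model id, dataset, and hyperparameters are illustrative assumptions, not the repository's own finetuning script; see the `finetuning` folder for the full recipes.

```python
# Minimal LoRA finetuning sketch (illustrative; model id, dataset, and
# hyperparameters are assumptions, not the repo's finetuning recipe).
import torch
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "meta-llama/Meta-Llama-3-8B"  # assumed checkpoint; requires access approval

tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # pure BF16, as in the table above
    device_map="auto",
)

# Wrap the base model with LoRA adapters so only a small set of weights trains.
lora_config = LoraConfig(
    r=8,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Small stand-in dataset, purely for illustration.
dataset = load_dataset("samsum", split="train[:100]")

def tokenize(batch):
    return tokenizer(batch["dialogue"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llama3-lora",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=4,  # gradient accumulation, as in the table
        num_train_epochs=1,
        bf16=True,
        logging_steps=10,
        report_to="none",  # set to "wandb" to enable the W&B experiment tracker
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("llama3-lora")  # saves only the LoRA adapter weights
```

For the multi-GPU features in the table (FSDP, HSDP, CPU offloading, FSDP-to-HF checkpoint conversion), use the scripts in the `finetuning` folder rather than this sketch.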