| SHA1 | Message | Date |
|---|---|---|
| 44ef280d31 | adding flash attention and xformer memory efficient through PT SDPA | 2 years ago |
| 71fdc4920a | Save memory and fix typos | 2 years ago |
| 7ec390bfc8 | aliging special tokens in toeknizer with HF latest | 2 years ago |
| d3d7a1656e | Update llama_finetuning.py | 2 years ago |
| 4767f09ecd | Initial commit | 2 years ago |
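
The first commit above refers to PyTorch's scaled dot-product attention (SDPA), which can dispatch to a flash-attention or xFormers-style memory-efficient kernel when one is available. A minimal sketch of that API for illustration only (not code from this repository):

```python
# Illustrative use of torch.nn.functional.scaled_dot_product_attention (SDPA).
# PyTorch selects a backend (flash, memory-efficient, or the math fallback)
# based on hardware, dtype, and input shapes.
import torch
import torch.nn.functional as F

# Toy tensors shaped (batch, heads, seq_len, head_dim); values are arbitrary.
q = torch.randn(1, 8, 128, 64)
k = torch.randn(1, 8, 128, 64)
v = torch.randn(1, 8, 128, 64)

# Fused causal attention in a single call; no explicit attention matrix is kept.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
print(out.shape)  # torch.Size([1, 8, 128, 64])
```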