| Author | Commit | Message | Age |
| --- | --- | --- | --- |
| Kai Wu | d9558c11ca | changed context name and add more docs | 1 year ago |
| Hamid Shojanazeri | 554396d4ec | bumping transformer versions for llama3 support | 1 year ago |
| Hamid Shojanazeri | d717be8ad4 | Merge pull request #3 from albertodepaola/l3p/finetuning_inference_chat_mods | 1 year ago |
| Matthias Reso | 43cb6a2db4 | Remove check for nightlies for low_cpu_fsdp and bump torch version to 2.2 instead | 1 year ago |
| varunfb | a404c9249c | Notebook to demonstrate using llama and llama-guard together using OctoAI | 1 year ago |
| Joone Hur | aec45aed81 | Add gradio to requirements.txt | 1 year ago |
| Beto | 7474514fe0 | Merging with main | 1 year ago |
| Beto | 7881b3bb99 | Changing safety utils to use HF classes to load Llama Guard. Removing Llama plain inference code | 1 year ago |
| Beto | 92be45b0fe | Adding matplotlib to requirements. Removing import from train_utils | 2 years ago |
| Matthias Reso | 1c473b6e7c | remove --find-links which is unsupported by packaging backends; update documentation on how to retrieve the correct pytorch version | 2 years ago |
| Matthias Reso | bf152a7dcb | Upgrade torch requirement to 2.1 RC | 2 years ago |
| Matthias Reso | 5b6858949d | remove version pinning from bitsandbytes | 2 years ago |
| Matthias Reso | 31fabb254a | Make vllm optional | 2 years ago |
| Matthias Reso | 2717048197 | Add vllm and pytest as dependencies | 2 years ago |
| Matthias Reso | 02428c992a | Adding vllm as dependency; fix dep install with hatchling | 2 years ago |
| Matthias Reso | c8522eb0ff | Remove peft install from src | 2 years ago |
| Hamid Shojanazeri | 44ef280d31 | adding flash attention and xformer memory-efficient attention through PT SDPA | 2 years ago |
| Hamid Shojanazeri | 954f6e741c | update transformers version requirement | 2 years ago |
| chauhang | 4767f09ecd | Initial commit | 2 years ago |