| File | Commit | Last commit message | Last updated |
| --- | --- | --- | --- |
| __init__.py | 207d2f80e9 | Make code-llama and hf-tgi inference runnable as module | 2 years ago |
| chat_utils.py | e554c1c8bf | The tokenizer will not add eos_token by default | 2 years ago |
| checkpoint_converter_fsdp_hf.py | ce9501f22c | remove relative imports | 2 years ago |
| llm.py | a404c9249c | Notebook to demonstrate using llama and llama-guard together using OctoAI | 1 year ago |
| model_utils.py | d51d2cce9c | adding sdpa for flash attn | 1 year ago |
| prompt_format_utils.py | 3e710f71f8 | renaming the prompt format file to conform to repo standards | 1 year ago |
| safety_utils.py | c0886a0a89 | Fixing typo in self | 1 year ago |