| Name | Commit | Commit message | Age |
| --- | --- | --- | --- |
| configs | fa0a389f74 | add max_step feature for training and eval | 1 year ago |
| data | 4913d3ad24 | Add missing copyright header | 1 year ago |
| datasets | 69db75d425 | fix incorrect split of InstructionDataset | 1 year ago |
| inference | f63ba19827 | Fixing tokenizer used for llama 3. Changing quantization configs on safety_utils. | 1 year ago |
| model_checkpointing | ce9501f22c | remove relative imports | 2 years ago |
| policies | ce9501f22c | remove relative imports | 2 years ago |
| tools | 4913d3ad24 | Add missing copyright header | 1 year ago |
| utils | e6f69f84ad | add max_steps_reached to reduce redundancy | 1 year ago |
| finetuning.py | 11f51db28c | adding the kbit prep in the code | 1 year ago |