Latest commit f63ba19827 by Beto: Fixing tokenizer used for llama 3. Changing quantization configs on safety_utils. (1 year ago)
Name                 Commit      Message                                                                             Last updated
configs              fa0a389f74  add max_step feature for training and eval                                          1 year ago
data                 4913d3ad24  Add missing copyright header                                                        1 year ago
datasets             69db75d425  fix incorrect split of InstructionDataset                                           1 year ago
inference            f63ba19827  Fixing tokenizer used for llama 3. Changing quantization configs on safety_utils.  1 year ago
model_checkpointing  ce9501f22c  remove relative imports                                                             2 years ago
policies             ce9501f22c  remove relative imports                                                             2 years ago
tools                4913d3ad24  Add missing copyright header                                                        1 year ago
utils                e6f69f84ad  add max_steps_reached to reduce redundancy                                          1 year ago
finetuning.py        11f51db28c  adding the kbit prep in the code                                                    1 year ago