# Training details

- Dataset: llama tokenizer, 4000
- Distribution: 85% / 15% (only for attributes)
- Number of epochs: 3
- Progressive: no
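The settings above can be captured in a small config fragment; a minimal sketch, assuming the 85% / 15% distribution is a train/eval split over the 4000-item dataset (the key names below are illustrative, not from the source):

```python
# Hedged sketch of the training details as a config dict.
# Key names (dataset_size, split, epochs) are assumptions for illustration.
config = {
    "tokenizer": "llama",
    "dataset_size": 4000,
    "split": {"train": 0.85, "eval": 0.15},  # 85% / 15%, attributes only per the notes
    "epochs": 3,
    "progressive": False,
}

# Derived counts under the assumed split interpretation.
train_count = int(config["dataset_size"] * config["split"]["train"])
eval_count = config["dataset_size"] - train_count
print(train_count, eval_count)  # → 3400 600
```

Keeping these values in one place makes it easy to log or serialize the run configuration alongside checkpoints.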