max_steps = 1000
learning_rate = 5e-7
label_smoothing = 0.2 # conservative-DPO smoothing; valid range is 0 to 0.5
warmup_ratio = 0.1
dpo_beta = 0.01
use_rslora = False
use_loftq = False
lora_rank = 16
lora_alpha = 16
lora_dropout = 0.05
load_separate_reference_model = False
max_seq_length = 2048
eval_steps = 200
train_split = 0.008
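These hyperparameters look like a DPO fine-tune with LoRA adapters. The actual training script is not part of this card, so the mapping below onto TRL's `DPOConfig` and PEFT's `LoraConfig` is an illustrative assumption, not the author's code (the `output_dir` value in particular is made up):

```python
# Hypothetical sketch: how the values above could map onto TRL + PEFT.
# This is an assumed wiring, not the training script used for this model.
from trl import DPOConfig
from peft import LoraConfig

training_args = DPOConfig(
    output_dir="openchat-nectar-0.19",  # assumed output path
    max_steps=1000,
    learning_rate=5e-7,
    warmup_ratio=0.1,
    beta=0.01,               # dpo_beta above
    label_smoothing=0.2,     # conservative-DPO smoothing, 0 to 0.5
    max_length=2048,         # max_seq_length above
    eval_steps=200,
    bf16=True,               # matches the BF16 tensor type of the weights
)

peft_config = LoraConfig(
    r=16,                    # lora_rank
    lora_alpha=16,
    lora_dropout=0.05,
    use_rslora=False,
    task_type="CAUSAL_LM",
)
```

With `load_separate_reference_model = False`, `DPOTrainer` can be passed `ref_model=None`: when training LoRA adapters it recovers the frozen reference policy by disabling the adapters, so no second full copy of the 7B model is kept in memory.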

Model size: 7.24B params · Tensor type: BF16 · Format: Safetensors

Model tree for andysalerno/openchat-nectar-0.19


Dataset used to train andysalerno/openchat-nectar-0.19