Model Card for mistral7b-v0.3-ultrachat200k

A LoRA fine-tune (rank r = 32) of Mistral-7B-v0.3, trained for 1 epoch on 208k chat examples from HuggingFaceH4/ultrachat_200k with a maximum sequence length of 16,384 tokens.
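
The sketch below illustrates an adapter configuration matching the description above, using peft. Only the rank (r = 32) and the base model implied by the repo name are taken from this card; the alpha, dropout, and target modules are illustrative assumptions, not reported values.

```python
# Minimal sketch of the LoRA setup described above.
# r=32 comes from this card; other hyperparameters are assumptions.
import torch
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.3",   # assumed base model, inferred from the repo name
    torch_dtype=torch.bfloat16,
)

lora_config = LoraConfig(
    r=32,                           # LoRA rank stated in this card
    lora_alpha=64,                  # assumption: not reported in this card
    lora_dropout=0.05,              # assumption
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumption
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # prints the small fraction of parameters trained by LoRA
```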

Model size: 7.25B parameters (BF16, Safetensors).
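
A minimal inference sketch with transformers, assuming the fine-tuned weights are published under this repo id (the namespace below is a placeholder) and that the tokenizer ships a chat template:

```python
# Load the model in BF16 and run a single chat turn.
# The repo id and chat template are assumptions; adjust to the actual repo.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistral7b-v0.3-ultrachat200k"  # placeholder: replace with the full "<user>/<repo>" id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [{"role": "user", "content": "Give me three tips for writing clear documentation."}]
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```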