---
library_name: peft
license: apache-2.0
base_model: NousResearch/Nous-Hermes-2-SOLAR-10.7B
tags:
  - axolotl
  - generated_from_trainer
model-index:
  - name: ecc2b8dc-3f2b-485b-ad43-af7d64536eea
    results: []
---

Built with Axolotl

# ecc2b8dc-3f2b-485b-ad43-af7d64536eea

This model is a fine-tuned version of [NousResearch/Nous-Hermes-2-SOLAR-10.7B](https://huggingface.co/NousResearch/Nous-Hermes-2-SOLAR-10.7B) on an unspecified dataset (the auto-generated card does not record it). It achieves the following results on the evaluation set:

- Loss: 0.3961
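
As a quick-start, here is a minimal loading sketch using `peft`'s `AutoPeftModelForCausalLM`. The adapter repo id `lesso18/ecc2b8dc-3f2b-485b-ad43-af7d64536eea` is an assumption inferred from this card's author and name, not confirmed by the card itself; substitute the real path if it differs.

```python
# Minimal loading sketch. The adapter repo id below is an assumption
# inferred from this card's author and model name.
import torch
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

adapter_id = "lesso18/ecc2b8dc-3f2b-485b-ad43-af7d64536eea"  # assumed path

# AutoPeftModelForCausalLM reads the base model reference
# (NousResearch/Nous-Hermes-2-SOLAR-10.7B) from the adapter config,
# loads it, and attaches the PEFT adapter on top.
model = AutoPeftModelForCausalLM.from_pretrained(
    adapter_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(adapter_id)

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```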

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the `TrainingArguments` sketch after the list):

- learning_rate: 0.000218
- train_batch_size: 4
- eval_batch_size: 4
- seed: 180
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: adamw_bnb_8bit with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 50
- training_steps: 500
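
For reference, a sketch of how these values map onto `transformers.TrainingArguments`. The run itself was launched through an Axolotl config, so this is an illustrative translation, not the original setup; `output_dir` is a placeholder.

```python
# Illustrative mapping of the listed hyperparameters onto TrainingArguments.
# Not the original Axolotl config; output_dir is a placeholder.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="outputs/ecc2b8dc",     # placeholder path
    learning_rate=0.000218,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=180,
    gradient_accumulation_steps=2,     # 4 x 2 = effective batch size of 8
    optim="adamw_bnb_8bit",            # 8-bit AdamW from bitsandbytes
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=50,
    max_steps=500,
)
```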

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| No log        | 0.0002 | 1    | 0.9534          |
| 0.5504        | 0.0104 | 50   | 0.5831          |
| 0.6452        | 0.0208 | 100  | 0.5537          |
| 0.5554        | 0.0312 | 150  | 0.5378          |
| 0.566         | 0.0416 | 200  | 0.5338          |
| 0.4541        | 0.0520 | 250  | 0.5013          |
| 0.4437        | 0.0624 | 300  | 0.4651          |
| 0.4195        | 0.0728 | 350  | 0.4388          |
| 0.4642        | 0.0832 | 400  | 0.4156          |
| 0.3864        | 0.0936 | 450  | 0.3997          |
| 0.4153        | 0.1040 | 500  | 0.3961          |
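
For eyeballing the trend, a small optional sketch that plots the validation-loss column above; the values are copied verbatim from the table.

```python
# Quick visualization of the logged validation loss (values copied
# verbatim from the table above); requires matplotlib.
import matplotlib.pyplot as plt

steps = [1, 50, 100, 150, 200, 250, 300, 350, 400, 450, 500]
val_loss = [0.9534, 0.5831, 0.5537, 0.5378, 0.5338,
            0.5013, 0.4651, 0.4388, 0.4156, 0.3997, 0.3961]

plt.plot(steps, val_loss, marker="o")
plt.xlabel("Step")
plt.ylabel("Validation loss")
plt.title("Validation loss over 500 training steps")
plt.show()
```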

### Framework versions

- PEFT 0.13.2
- Transformers 4.46.0
- PyTorch 2.5.0+cu124
- Datasets 3.0.1
- Tokenizers 0.20.1