---
license: mit
base_model: gpt2-medium
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: N-TSZ
  results: []
language:
- en
---

# N-TSZ

This model is a fine-tuned version of [gpt2-medium](https://huggingface.co/gpt2-medium) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 4.0972
- Accuracy: 0.0

## Model description

Also sprach Zarathustra!

## Intended uses & limitations

The main intended use & limitation is to give birth to a dancing star!

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch appears at the end of this card):
- learning_rate: 0.0001
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 8
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 256

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 1.3356        | 2.0870 | 120  | 3.4775          | 0.0      |
| 0.1808        | 4.1739 | 240  | 4.0972          | 0.0      |

### Framework versions

- Transformers 4.43.1
- PyTorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
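
### Configuration sketch

A minimal sketch of how the hyperparameters listed under "Training hyperparameters" might map to Hugging Face `TrainingArguments`; the output directory is a placeholder and this is not the original training script.

```python
# Sketch only: reproduces the listed hyperparameters, not the full pipeline.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="N-TSZ",                 # placeholder output directory
    learning_rate=1e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=2,      # effective train batch size of 8
    seed=42,
    lr_scheduler_type="linear",
    max_steps=256,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 corresponds to the
    # default optimizer settings in Transformers.
)
```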
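
### Usage sketch

Since the model is a standard GPT-2 causal language model, it can be loaded with the usual Transformers classes. The repo id below is a placeholder; replace it with the actual Hub path or a local checkpoint directory.

```python
# Minimal generation sketch; "N-TSZ" is an assumed repo id, not confirmed by the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("N-TSZ")
model = AutoModelForCausalLM.from_pretrained("N-TSZ")

prompt = "Also sprach Zarathustra:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```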