CTMAE-P2-V3-3G-S4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8000
  • Accuracy: 0.8222
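
Since the usage sections below are still unfilled, here is a minimal inference sketch. It assumes the checkpoint keeps the standard 16-frame VideoMAE video-classification interface from Transformers and that the image-processor config is available from the same repository; the random clip is only a stand-in for real video frames.

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, VideoMAEForVideoClassification

ckpt = "beingbatman/CTMAE-P2-V3-3G-S4"

# Assumes the preprocessing config was saved alongside the fine-tuned weights.
processor = AutoImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)

# VideoMAE expects a clip of 16 frames (H x W x C, uint8); random data stands in here.
video = list(np.random.randint(0, 256, (16, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```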

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6500
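
These settings map directly onto transformers.TrainingArguments (with warmup_ratio=0.1 over 6,500 steps, warmup spans the first 650 steps). The sketch below shows one way to reproduce them; output_dir is an assumption, not taken from this card.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters as TrainingArguments;
# output_dir is assumed, everything else mirrors the list above.
args = TrainingArguments(
    output_dir="CTMAE-P2-V3-3G-S4",
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,  # 10% of 6,500 steps = 650 warmup steps
    max_steps=6500,
)
```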

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.649         | 0.0202  | 131  | 0.8224          | 0.3778   |
| 0.4184        | 1.0202  | 262  | 1.4642          | 0.3778   |
| 0.7051        | 2.0202  | 393  | 1.1787          | 0.3778   |
| 0.5842        | 3.0202  | 524  | 0.9482          | 0.3778   |
| 0.9705        | 4.0202  | 655  | 1.0651          | 0.3778   |
| 0.6554        | 5.0202  | 786  | 0.9533          | 0.3778   |
| 0.8662        | 6.0202  | 917  | 1.2020          | 0.3778   |
| 1.8391        | 7.0202  | 1048 | 1.3808          | 0.3778   |
| 0.492         | 8.0202  | 1179 | 1.0200          | 0.3778   |
| 0.3752        | 9.0202  | 1310 | 0.8383          | 0.5778   |
| 0.8025        | 10.0202 | 1441 | 1.2180          | 0.5111   |
| 0.6779        | 11.0202 | 1572 | 1.9618          | 0.3778   |
| 0.598         | 12.0202 | 1703 | 0.7358          | 0.7111   |
| 1.4615        | 13.0202 | 1834 | 0.9323          | 0.6222   |
| 0.6769        | 14.0202 | 1965 | 1.1934          | 0.6222   |
| 0.398         | 15.0202 | 2096 | 1.3051          | 0.6444   |
| 0.2597        | 16.0202 | 2227 | 0.6407          | 0.7333   |
| 0.9731        | 17.0202 | 2358 | 0.8000          | 0.8222   |
| 0.9509        | 18.0202 | 2489 | 1.0755          | 0.7111   |
| 0.3026        | 19.0202 | 2620 | 1.6900          | 0.6444   |
| 0.3618        | 20.0202 | 2751 | 1.8778          | 0.6      |
| 0.2526        | 21.0202 | 2882 | 2.0385          | 0.6      |
| 1.7509        | 22.0202 | 3013 | 1.9079          | 0.6      |
| 1.0213        | 23.0202 | 3144 | 1.3900          | 0.7333   |
| 0.1836        | 24.0202 | 3275 | 1.8195          | 0.6222   |
| 0.2277        | 25.0202 | 3406 | 2.1068          | 0.5778   |
| 0.3344        | 26.0202 | 3537 | 2.1472          | 0.6222   |
| 0.5114        | 27.0202 | 3668 | 2.5289          | 0.5778   |
| 0.0018        | 28.0202 | 3799 | 2.3118          | 0.6444   |
| 0.1634        | 29.0202 | 3930 | 2.7060          | 0.5778   |
| 0.2339        | 30.0202 | 4061 | 2.3984          | 0.6222   |
| 0.0997        | 31.0202 | 4192 | 3.5809          | 0.5111   |
| 0.8974        | 32.0202 | 4323 | 2.8206          | 0.5556   |
| 0.4314        | 33.0202 | 4454 | 3.5183          | 0.4667   |
| 0.0006        | 34.0202 | 4585 | 2.4841          | 0.6222   |
| 0.185         | 35.0202 | 4716 | 3.3856          | 0.5556   |
| 0.1699        | 36.0202 | 4847 | 2.9514          | 0.5778   |
| 0.4543        | 37.0202 | 4978 | 2.5094          | 0.6444   |
| 0.0003        | 38.0202 | 5109 | 2.2945          | 0.6667   |
| 0.3989        | 39.0202 | 5240 | 2.7996          | 0.6      |
| 0.0853        | 40.0202 | 5371 | 3.2383          | 0.5778   |
| 0.2906        | 41.0202 | 5502 | 3.0334          | 0.6      |
| 0.0006        | 42.0202 | 5633 | 2.9619          | 0.6      |
| 0.0001        | 43.0202 | 5764 | 3.3620          | 0.6      |
| 0.0021        | 44.0202 | 5895 | 3.2390          | 0.5778   |
| 0.1838        | 45.0202 | 6026 | 3.3982          | 0.5778   |
| 0.0028        | 46.0202 | 6157 | 3.3721          | 0.5778   |
| 0.0002        | 47.0202 | 6288 | 3.4766          | 0.5778   |
| 0.067         | 48.0202 | 6419 | 3.3522          | 0.5778   |
| 0.0001        | 49.0125 | 6500 | 3.3565          | 0.5778   |

The reported evaluation results (loss 0.8000, accuracy 0.8222) correspond to the step-2358 checkpoint (epoch 17), where validation accuracy peaks; validation loss trends upward over the remaining epochs while training loss approaches zero, indicating overfitting.
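
The card does not document how the step-2358 checkpoint was selected. For a re-run, one plausible setup (assumed, not from the card, and requiring a compute_metrics function that reports "accuracy") keeps the best-accuracy checkpoint automatically:

```python
from transformers import TrainingArguments

# Assumed checkpoint-selection settings; evaluation ran once per epoch
# (roughly every 131 steps in the table above). Other hyperparameters as listed earlier.
args = TrainingArguments(
    output_dir="CTMAE-P2-V3-3G-S4",
    eval_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
    greater_is_better=True,
)
```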

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0