CTMAE-P2-V3-3G-S1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (these match the step-4160 / epoch-31.02 checkpoint in the training results table below, the best validation accuracy reached during training):

  • Loss: 0.5835
  • Accuracy: 0.8696

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 6500
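Taken together, the warmup ratio and step count above imply 650 warmup steps. A minimal sketch of the resulting linear-with-warmup schedule (it mirrors the usual `get_linear_schedule_with_warmup` formula, but is an illustration, not the trainer's exact internals):

```python
# Sketch of the linear schedule with warmup implied by the hyperparameters
# above: learning_rate=1e-5, warmup_ratio=0.1, training_steps=6500.
BASE_LR = 1e-5
TOTAL_STEPS = 6500
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # 650 steps

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 up to BASE_LR over the warmup phase.
        return BASE_LR * step / WARMUP_STEPS
    # Linear decay from BASE_LR down to 0 at TOTAL_STEPS.
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(lr_at(0))      # 0.0
print(lr_at(650))    # peak: 1e-05
print(lr_at(6500))   # 0.0
```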

Training results

Training Loss Epoch Step Validation Loss Accuracy
0.6279 0.02 130 0.7452 0.4783
0.6448 1.02 260 0.9604 0.4783
0.675 2.02 390 0.7935 0.4783
0.5523 3.02 520 0.8305 0.4783
0.9104 4.02 650 0.8216 0.4783
0.5657 5.02 780 1.2400 0.4783
1.1352 6.02 910 0.6858 0.4783
0.77 7.02 1040 0.9103 0.5217
1.4024 8.02 1170 0.9320 0.6522
0.7694 9.02 1300 1.1192 0.5652
0.663 10.02 1430 0.8375 0.6739
1.0107 11.02 1560 0.9901 0.6087
0.8404 12.02 1690 0.4649 0.7826
0.7372 13.02 1820 1.2412 0.6739
1.2033 14.02 1950 1.5908 0.6304
0.7936 15.02 2080 0.8874 0.7174
1.3059 16.02 2210 0.6237 0.7826
1.0162 17.02 2340 0.6233 0.8043
0.9241 18.02 2470 1.5554 0.6304
0.9925 19.02 2600 0.6251 0.8043
0.844 20.02 2730 1.0150 0.7174
0.6418 21.02 2860 0.5920 0.8043
1.0493 22.02 2990 0.8085 0.7826
0.6551 23.02 3120 1.5049 0.6957
0.596 24.02 3250 0.7728 0.7826
0.5281 25.02 3380 0.7842 0.8043
0.4911 26.02 3510 0.6299 0.8043
0.2105 27.02 3640 0.8429 0.7826
0.6859 28.02 3770 1.0266 0.7391
0.5047 29.02 3900 1.0786 0.7609
0.8081 30.02 4030 0.6552 0.8478
0.4284 31.02 4160 0.5835 0.8696
0.3249 32.02 4290 0.9921 0.7826
0.8226 33.02 4420 0.7955 0.8261
0.0009 34.02 4550 1.0117 0.8043
0.6603 35.02 4680 1.4238 0.7609
0.144 36.02 4810 1.0399 0.7826
0.4473 37.02 4940 0.9877 0.8043
0.0012 38.02 5070 0.9295 0.8043
0.1138 39.02 5200 1.1066 0.7826
0.4031 40.02 5330 1.2339 0.7826
0.1228 41.02 5460 1.0563 0.8261
0.224 42.02 5590 1.0712 0.8043
0.0445 43.02 5720 1.2617 0.7609
0.1742 44.02 5850 1.1758 0.7826
0.5739 45.02 5980 1.4026 0.7391
0.1828 46.02 6110 1.3709 0.7609
0.0153 47.02 6240 1.2819 0.8043
0.0002 48.02 6370 1.3020 0.8043
0.0022 49.02 6500 1.2445 0.8043
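One detail not stated in the card but recoverable from the numbers: every accuracy value in the table (including the headline 0.8696) equals k/46 for an integer k, which is consistent with an evaluation set of 46 clips. A quick sanity check of that inference:

```python
# Hypothesis (inferred from the table, NOT stated in the card): the eval
# set has 46 examples, so every reported accuracy is k/46 for integer k.
accuracies = [0.4783, 0.5217, 0.5652, 0.6087, 0.6304, 0.6522, 0.6739,
              0.6957, 0.7174, 0.7391, 0.7609, 0.7826, 0.8043, 0.8261,
              0.8478, 0.8696]
for a in accuracies:
    k = round(a * 46)
    # Each value should round-trip to within the 4-decimal reporting precision.
    assert abs(k / 46 - a) < 5e-5, (a, k)
    print(f"{a:.4f} = {k}/46")
```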

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
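
To reproduce this environment, the versions above can be pinned as follows (a sketch; `2.0.1+cu117` is a CUDA-11.7-specific torch build, so pick the wheel matching your system):

```shell
# Pin the framework versions listed in this card (adjust the torch build
# to your CUDA/CPU setup; +cu117 wheels come from the PyTorch wheel index).
pip install "transformers==4.46.2" "datasets==3.0.1" "tokenizers==0.20.0"
pip install "torch==2.0.1" --index-url https://download.pytorch.org/whl/cu117
```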

Model size

  • 304M parameters (F32, safetensors)

Model tree

  • beingbatman/CTMAE-P2-V3-3G-S1 (finetuned from MCG-NJU/videomae-large-finetuned-kinetics)