# detr_finetuned_cppe5
This model is a fine-tuned version of microsoft/conditional-detr-resnet-50 on the CPPE-5 medical personal protective equipment dataset. It achieves the following results on the evaluation set:
- Loss: 1.2223
- Map: 0.3117
- Map 50: 0.6293
- Map 75: 0.2667
- Map Small: 0.106
- Map Medium: 0.2851
- Map Large: 0.4545
- Mar 1: 0.3195
- Mar 10: 0.4877
- Mar 100: 0.5017
- Mar Small: 0.3298
- Mar Medium: 0.4402
- Mar Large: 0.6375
- Map Coverall: 0.5742
- Mar 100 Coverall: 0.7165
- Map Face Shield: 0.2986
- Mar 100 Face Shield: 0.5309
- Map Gloves: 0.2081
- Mar 100 Gloves: 0.3644
- Map Goggles: 0.1561
- Mar 100 Goggles: 0.4509
- Map Mask: 0.3214
- Mar 100 Mask: 0.4458
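Map 50 and Map 75 are COCO-style mean average precision at IoU thresholds of 0.50 and 0.75, while Map averages over thresholds 0.50 to 0.95. A minimal sketch of the box IoU that these thresholds are applied to (an illustration, not part of the evaluation code):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes given as (xmin, ymin, xmax, ymax)."""
    ix = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))  # overlap width
    iy = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))  # overlap height
    inter = ix * iy
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

print(box_iou((0, 0, 2, 2), (1, 0, 3, 2)))  # partially overlapping boxes
```

A predicted box counts as a true positive for Map 50 when its IoU with a ground-truth box of the same class is at least 0.50; Map 75 applies the stricter 0.75 cutoff, which is why it is consistently lower in the table below.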
## Model description
More information needed
## Intended uses & limitations
More information needed
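A hypothetical inference sketch using the `transformers` object-detection pipeline; the checkpoint name, image path, and the `keep_confident` helper are assumptions for illustration, not part of the released model:

```python
# Hypothetical usage (requires `transformers`, `torch`, `Pillow`, and network access):
#   from transformers import pipeline
#   detector = pipeline("object-detection", model="Leotrim/detr_finetuned_cppe5")
#   results = detector("worker.jpg")  # hypothetical image path
#
# The pipeline returns one dict per detection, shaped like:
results = [
    {"score": 0.92, "label": "Coverall", "box": {"xmin": 10, "ymin": 20, "xmax": 200, "ymax": 400}},
    {"score": 0.31, "label": "Goggles", "box": {"xmin": 50, "ymin": 25, "xmax": 90, "ymax": 60}},
]

def keep_confident(detections, threshold=0.5):
    """Illustrative helper: drop low-confidence detections before display."""
    return [d for d in detections if d["score"] >= threshold]

print([d["label"] for d in keep_confident(results)])
```

Given the modest per-class scores above (notably Goggles at 0.1561 mAP), a confidence threshold is likely worth tuning per class before any downstream use.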
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
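With `lr_scheduler_type: cosine` and no warmup, the learning rate decays from 5e-05 toward zero over the run's 3210 optimizer steps (107 steps per epoch, as in the results table, times 30 epochs). A minimal sketch of that schedule (an illustration, not the Trainer's implementation):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-05):
    """Cosine decay from base_lr at step 0 to ~0 at total_steps (no warmup)."""
    progress = min(step / total_steps, 1.0)
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

print(cosine_lr(0, 3210))     # base rate at the start of training
print(cosine_lr(1605, 3210))  # half the base rate at the midpoint
```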
### Training results
Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 107 | 2.2632 | 0.0062 | 0.0235 | 0.0011 | 0.0073 | 0.0103 | 0.0113 | 0.0201 | 0.0898 | 0.1405 | 0.1997 | 0.1623 | 0.174 | 0.0017 | 0.0824 | 0.0034 | 0.0471 | 0.0033 | 0.1955 | 0.0037 | 0.0727 | 0.019 | 0.3047 |
No log | 2.0 | 214 | 2.0234 | 0.0428 | 0.1012 | 0.0317 | 0.021 | 0.0786 | 0.0501 | 0.1375 | 0.2463 | 0.2875 | 0.2315 | 0.3098 | 0.3033 | 0.1382 | 0.5733 | 0.0104 | 0.1529 | 0.0071 | 0.2379 | 0.0034 | 0.1182 | 0.0549 | 0.3553 |
No log | 3.0 | 321 | 1.8733 | 0.0639 | 0.1631 | 0.0411 | 0.0314 | 0.1147 | 0.0873 | 0.1589 | 0.3112 | 0.3549 | 0.2757 | 0.3445 | 0.4228 | 0.168 | 0.6642 | 0.0241 | 0.2309 | 0.0237 | 0.2819 | 0.0083 | 0.2091 | 0.0956 | 0.3884 |
No log | 4.0 | 428 | 1.7409 | 0.115 | 0.2879 | 0.0752 | 0.07 | 0.1591 | 0.1481 | 0.1731 | 0.3487 | 0.3892 | 0.3703 | 0.3574 | 0.4902 | 0.276 | 0.617 | 0.0547 | 0.3471 | 0.0586 | 0.3339 | 0.0157 | 0.2436 | 0.17 | 0.4042 |
2.0328 | 5.0 | 535 | 1.6217 | 0.1734 | 0.3876 | 0.1395 | 0.1021 | 0.2176 | 0.239 | 0.2162 | 0.4068 | 0.4342 | 0.3228 | 0.3911 | 0.5637 | 0.4173 | 0.6705 | 0.1075 | 0.4059 | 0.0927 | 0.3655 | 0.0222 | 0.3182 | 0.2271 | 0.4111 |
2.0328 | 6.0 | 642 | 1.5774 | 0.1806 | 0.4081 | 0.1359 | 0.0693 | 0.1831 | 0.2549 | 0.2169 | 0.3982 | 0.4191 | 0.2609 | 0.3582 | 0.5482 | 0.4468 | 0.6687 | 0.1109 | 0.3926 | 0.0985 | 0.3316 | 0.0344 | 0.3382 | 0.2123 | 0.3642 |
2.0328 | 7.0 | 749 | 1.4768 | 0.1974 | 0.4263 | 0.162 | 0.0814 | 0.1819 | 0.3263 | 0.2326 | 0.4013 | 0.4296 | 0.25 | 0.3439 | 0.5935 | 0.4733 | 0.6682 | 0.0929 | 0.3765 | 0.1278 | 0.3458 | 0.0289 | 0.3473 | 0.2641 | 0.4105 |
2.0328 | 8.0 | 856 | 1.4344 | 0.2161 | 0.4708 | 0.181 | 0.0699 | 0.2098 | 0.333 | 0.2364 | 0.4117 | 0.4391 | 0.3247 | 0.391 | 0.5775 | 0.5021 | 0.6795 | 0.1504 | 0.4412 | 0.1385 | 0.3209 | 0.0341 | 0.3745 | 0.2553 | 0.3795 |
2.0328 | 9.0 | 963 | 1.4459 | 0.2189 | 0.4586 | 0.1836 | 0.1123 | 0.2038 | 0.3322 | 0.2571 | 0.4138 | 0.4358 | 0.3249 | 0.3825 | 0.5595 | 0.4938 | 0.6756 | 0.1193 | 0.4191 | 0.1437 | 0.3401 | 0.0531 | 0.3509 | 0.2845 | 0.3932 |
1.4446 | 10.0 | 1070 | 1.3804 | 0.2384 | 0.5297 | 0.1935 | 0.0951 | 0.2296 | 0.3716 | 0.2606 | 0.44 | 0.4664 | 0.2763 | 0.4271 | 0.5924 | 0.5123 | 0.6835 | 0.164 | 0.4794 | 0.193 | 0.3881 | 0.0584 | 0.4018 | 0.2642 | 0.3789 |
1.4446 | 11.0 | 1177 | 1.3651 | 0.2451 | 0.532 | 0.191 | 0.144 | 0.2261 | 0.3733 | 0.2756 | 0.4496 | 0.4642 | 0.2983 | 0.3913 | 0.6093 | 0.5158 | 0.7091 | 0.1875 | 0.4956 | 0.1824 | 0.3616 | 0.0623 | 0.3655 | 0.2777 | 0.3895 |
1.4446 | 12.0 | 1284 | 1.3426 | 0.2526 | 0.5358 | 0.208 | 0.1033 | 0.2291 | 0.38 | 0.285 | 0.4553 | 0.4771 | 0.281 | 0.4179 | 0.6092 | 0.5401 | 0.692 | 0.2285 | 0.5235 | 0.1492 | 0.3463 | 0.0697 | 0.4164 | 0.2753 | 0.4074 |
1.4446 | 13.0 | 1391 | 1.3738 | 0.2444 | 0.5204 | 0.2107 | 0.09 | 0.224 | 0.3802 | 0.2751 | 0.4449 | 0.4625 | 0.3096 | 0.398 | 0.5982 | 0.52 | 0.6835 | 0.1954 | 0.5132 | 0.1721 | 0.3362 | 0.0596 | 0.3655 | 0.2749 | 0.4142 |
1.4446 | 14.0 | 1498 | 1.3362 | 0.2562 | 0.5391 | 0.2243 | 0.0838 | 0.2223 | 0.4115 | 0.2789 | 0.4514 | 0.4694 | 0.2657 | 0.4098 | 0.6113 | 0.5536 | 0.6994 | 0.1741 | 0.5 | 0.1999 | 0.3593 | 0.089 | 0.3982 | 0.2646 | 0.39 |
1.2339 | 15.0 | 1605 | 1.2863 | 0.274 | 0.5738 | 0.235 | 0.1041 | 0.2467 | 0.4314 | 0.2953 | 0.4712 | 0.4907 | 0.3404 | 0.4268 | 0.6344 | 0.5423 | 0.7063 | 0.2487 | 0.5147 | 0.1963 | 0.3825 | 0.1043 | 0.4382 | 0.2783 | 0.4121 |
1.2339 | 16.0 | 1712 | 1.2890 | 0.2834 | 0.5828 | 0.246 | 0.1031 | 0.2535 | 0.4322 | 0.2994 | 0.4643 | 0.4831 | 0.3513 | 0.4217 | 0.6215 | 0.5515 | 0.6909 | 0.2501 | 0.5103 | 0.2154 | 0.3689 | 0.1029 | 0.4418 | 0.2969 | 0.4037 |
1.2339 | 17.0 | 1819 | 1.3175 | 0.2706 | 0.5655 | 0.2381 | 0.096 | 0.2336 | 0.4086 | 0.2952 | 0.4623 | 0.4779 | 0.3177 | 0.4192 | 0.6109 | 0.5271 | 0.6903 | 0.2482 | 0.5044 | 0.1664 | 0.352 | 0.1075 | 0.4309 | 0.3035 | 0.4121 |
1.2339 | 18.0 | 1926 | 1.2626 | 0.2851 | 0.5718 | 0.2366 | 0.0902 | 0.2654 | 0.4276 | 0.3093 | 0.4791 | 0.4957 | 0.2848 | 0.443 | 0.6326 | 0.5663 | 0.7091 | 0.2394 | 0.5279 | 0.204 | 0.3701 | 0.1101 | 0.4509 | 0.3058 | 0.4205 |
1.0914 | 19.0 | 2033 | 1.2619 | 0.2947 | 0.6021 | 0.2419 | 0.1009 | 0.2725 | 0.4294 | 0.3038 | 0.4833 | 0.4971 | 0.3184 | 0.4445 | 0.6232 | 0.5687 | 0.7017 | 0.2691 | 0.5368 | 0.2063 | 0.365 | 0.1127 | 0.4455 | 0.3169 | 0.4368 |
1.0914 | 20.0 | 2140 | 1.2522 | 0.3037 | 0.6086 | 0.2784 | 0.1125 | 0.2678 | 0.4599 | 0.3166 | 0.4787 | 0.4927 | 0.3366 | 0.4307 | 0.6243 | 0.5613 | 0.692 | 0.2928 | 0.5279 | 0.2075 | 0.3695 | 0.1509 | 0.4382 | 0.306 | 0.4358 |
1.0914 | 21.0 | 2247 | 1.2523 | 0.3006 | 0.6162 | 0.2592 | 0.1084 | 0.263 | 0.4545 | 0.3158 | 0.4778 | 0.4933 | 0.3353 | 0.4233 | 0.6332 | 0.5636 | 0.6989 | 0.2845 | 0.5162 | 0.1984 | 0.3599 | 0.1427 | 0.4582 | 0.3139 | 0.4332 |
1.0914 | 22.0 | 2354 | 1.2415 | 0.3077 | 0.624 | 0.2611 | 0.1393 | 0.2779 | 0.4448 | 0.3182 | 0.4826 | 0.4931 | 0.3503 | 0.426 | 0.6303 | 0.5733 | 0.7063 | 0.2865 | 0.5324 | 0.2076 | 0.3559 | 0.151 | 0.4418 | 0.32 | 0.4289 |
1.0914 | 23.0 | 2461 | 1.2369 | 0.306 | 0.6127 | 0.28 | 0.1183 | 0.2778 | 0.4528 | 0.3185 | 0.4812 | 0.4912 | 0.3626 | 0.4312 | 0.6278 | 0.5671 | 0.7017 | 0.2834 | 0.5221 | 0.2051 | 0.3593 | 0.1536 | 0.4309 | 0.3208 | 0.4421 |
1.0025 | 24.0 | 2568 | 1.2379 | 0.3076 | 0.6168 | 0.2685 | 0.1043 | 0.2796 | 0.4559 | 0.3191 | 0.4815 | 0.4946 | 0.3321 | 0.4344 | 0.6313 | 0.5695 | 0.7091 | 0.2823 | 0.5221 | 0.2061 | 0.3644 | 0.1542 | 0.4327 | 0.326 | 0.4447 |
1.0025 | 25.0 | 2675 | 1.2307 | 0.3139 | 0.6266 | 0.2715 | 0.1157 | 0.2888 | 0.4601 | 0.3206 | 0.4855 | 0.5023 | 0.3389 | 0.4441 | 0.6388 | 0.5695 | 0.7091 | 0.2934 | 0.525 | 0.2123 | 0.3678 | 0.1651 | 0.4545 | 0.3293 | 0.4553 |
1.0025 | 26.0 | 2782 | 1.2233 | 0.3133 | 0.6269 | 0.271 | 0.109 | 0.2844 | 0.4571 | 0.3171 | 0.4862 | 0.5019 | 0.3415 | 0.4391 | 0.6377 | 0.5774 | 0.7142 | 0.2981 | 0.5279 | 0.2093 | 0.3633 | 0.1619 | 0.46 | 0.32 | 0.4442 |
1.0025 | 27.0 | 2889 | 1.2248 | 0.313 | 0.6267 | 0.2686 | 0.1104 | 0.2867 | 0.4571 | 0.3185 | 0.4878 | 0.5026 | 0.361 | 0.44 | 0.637 | 0.5724 | 0.717 | 0.301 | 0.5338 | 0.2043 | 0.365 | 0.1653 | 0.4509 | 0.322 | 0.4463 |
1.0025 | 28.0 | 2996 | 1.2249 | 0.311 | 0.6268 | 0.2671 | 0.1071 | 0.2842 | 0.4533 | 0.3186 | 0.4887 | 0.5021 | 0.3363 | 0.4395 | 0.6376 | 0.5732 | 0.7176 | 0.2967 | 0.5265 | 0.201 | 0.3588 | 0.1609 | 0.4564 | 0.3234 | 0.4511 |
0.9487 | 29.0 | 3103 | 1.2225 | 0.3125 | 0.6304 | 0.2661 | 0.1045 | 0.2847 | 0.4557 | 0.3199 | 0.4891 | 0.503 | 0.3324 | 0.4407 | 0.639 | 0.5755 | 0.7182 | 0.3001 | 0.5338 | 0.2089 | 0.365 | 0.1574 | 0.4527 | 0.3208 | 0.4453 |
0.9487 | 30.0 | 3210 | 1.2223 | 0.3117 | 0.6293 | 0.2667 | 0.106 | 0.2851 | 0.4545 | 0.3195 | 0.4877 | 0.5017 | 0.3298 | 0.4402 | 0.6375 | 0.5742 | 0.7165 | 0.2986 | 0.5309 | 0.2081 | 0.3644 | 0.1561 | 0.4509 | 0.3214 | 0.4458 |
### Framework versions
- Transformers 4.42.3
- PyTorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1