ArrayDice committed (verified)
Commit 0b83efb · 1 Parent(s): 7f33e40

End of training

Files changed (2)
  1. README.md +110 -0
  2. model.safetensors +1 -1
README.md ADDED
@@ -0,0 +1,110 @@
---
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
- generated_from_trainer
model-index:
- name: Vehicle_Detection_Model
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# Vehicle_Detection_Model

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7220
- Map: 0.0875
- Map 50: 0.1634
- Map 75: 0.084
- Map Small: 0.355
- Map Medium: 0.1423
- Map Large: 0.0499
- Mar 1: 0.1462
- Mar 10: 0.2602
- Mar 100: 0.2709
- Mar Small: 0.5
- Mar Medium: 0.4013
- Mar Large: 0.3
- Map Camping car: 0.0039
- Mar 100 Camping car: 0.35
- Map Car: 0.4971
- Mar 100 Car: 0.6256
- Map Other: 0.0
- Mar 100 Other: 0.0
- Map Pickup: 0.0239
- Mar 100 Pickup: 0.65
- Map Truck: 0.0
- Mar 100 Truck: 0.0
- Map Van: 0.0
- Mar 100 Van: 0.0

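As a quick-start reference, the snippet below sketches how the checkpoint could be loaded for inference with the 🤗 Transformers object-detection API. The repository id `ArrayDice/Vehicle_Detection_Model`, the image path, and the 0.5 score threshold are assumptions for illustration, not values taken from this card.

```python
# Minimal inference sketch. Assumptions (not from the card): the hub repo id,
# the local image file, and the 0.5 confidence threshold.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "ArrayDice/Vehicle_Detection_Model"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("street_scene.jpg").convert("RGB")  # any image with vehicles
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {[round(v, 1) for v in box.tolist()]}")
```

Given the per-class numbers above, detections outside the Car class should be treated with caution.
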
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30

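For orientation, the values listed above map roughly onto a `transformers.TrainingArguments` configuration as sketched below; the `output_dir` name is an assumption, and this is not the original training script.

```python
# Rough mapping of the reported hyperparameters onto TrainingArguments.
# Only the values listed above come from the card; output_dir is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Vehicle_Detection_Model",  # assumed
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```
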
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Camping car | Mar 100 Camping car | Map Car | Mar 100 Car | Map Other | Mar 100 Other | Map Pickup | Mar 100 Pickup | Map Truck | Mar 100 Truck | Map Van | Mar 100 Van |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:---------------:|:-------------------:|:-------:|:-----------:|:---------:|:-------------:|:----------:|:--------------:|:---------:|:-------------:|:-------:|:-----------:|
| No log | 1.0 | 232 | 1.3186 | 0.0125 | 0.0292 | 0.0085 | 0.0059 | 0.0197 | 0.004 | 0.0182 | 0.0467 | 0.0903 | 0.0556 | 0.1396 | 0.1 | 0.0 | 0.0 | 0.0751 | 0.5416 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| No log | 2.0 | 464 | 1.0802 | 0.0375 | 0.0832 | 0.0283 | 0.263 | 0.0585 | 0.0102 | 0.0254 | 0.0734 | 0.0865 | 0.2667 | 0.1317 | 0.15 | 0.0 | 0.0 | 0.2249 | 0.5189 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4746 | 3.0 | 696 | 0.9804 | 0.0608 | 0.125 | 0.0483 | 0.2948 | 0.0938 | 0.075 | 0.0305 | 0.0821 | 0.0903 | 0.3556 | 0.1372 | 0.075 | 0.0 | 0.0 | 0.3646 | 0.5416 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.4746 | 4.0 | 928 | 0.9139 | 0.0712 | 0.1426 | 0.0577 | 0.2232 | 0.1091 | 0.125 | 0.0315 | 0.085 | 0.0959 | 0.3333 | 0.1459 | 0.125 | 0.0 | 0.0 | 0.4274 | 0.5754 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0134 | 5.0 | 1160 | 0.9581 | 0.0637 | 0.1367 | 0.0426 | 0.0804 | 0.0993 | 0.075 | 0.0289 | 0.0771 | 0.0846 | 0.2556 | 0.1292 | 0.075 | 0.0 | 0.0 | 0.3823 | 0.5078 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 1.0134 | 6.0 | 1392 | 0.8830 | 0.0743 | 0.1476 | 0.0645 | 0.2171 | 0.1139 | 0.075 | 0.0345 | 0.0862 | 0.0967 | 0.3222 | 0.1474 | 0.075 | 0.0 | 0.0 | 0.4456 | 0.5801 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9133 | 7.0 | 1624 | 0.8645 | 0.0716 | 0.147 | 0.0571 | 0.2156 | 0.1099 | 0.0752 | 0.0329 | 0.0853 | 0.0966 | 0.3111 | 0.147 | 0.175 | 0.0 | 0.0 | 0.4296 | 0.5797 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.9133 | 8.0 | 1856 | 0.8776 | 0.0676 | 0.1478 | 0.0445 | 0.2225 | 0.1032 | 0.1254 | 0.0317 | 0.0811 | 0.0934 | 0.3333 | 0.1417 | 0.2 | 0.0 | 0.0 | 0.4056 | 0.5601 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8617 | 9.0 | 2088 | 0.8556 | 0.0754 | 0.1506 | 0.0638 | 0.252 | 0.1153 | 0.1254 | 0.0323 | 0.0881 | 0.0999 | 0.3667 | 0.1517 | 0.2 | 0.0 | 0.0 | 0.4525 | 0.5996 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8617 | 10.0 | 2320 | 0.7552 | 0.0821 | 0.1528 | 0.0803 | 0.361 | 0.1248 | 0.1257 | 0.0347 | 0.1119 | 0.1312 | 0.5222 | 0.1747 | 0.2625 | 0.0 | 0.0 | 0.4921 | 0.6206 | 0.0 | 0.0 | 0.0004 | 0.1667 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8182 | 11.0 | 2552 | 0.8211 | 0.0741 | 0.1525 | 0.0627 | 0.3014 | 0.114 | 0.0503 | 0.0381 | 0.1098 | 0.1544 | 0.4222 | 0.2641 | 0.125 | 0.0001 | 0.1 | 0.4422 | 0.5762 | 0.0 | 0.0 | 0.002 | 0.25 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8182 | 12.0 | 2784 | 0.8140 | 0.0734 | 0.1486 | 0.056 | 0.214 | 0.1129 | 0.0506 | 0.0451 | 0.1066 | 0.1184 | 0.3778 | 0.1941 | 0.125 | 0.0 | 0.0 | 0.4382 | 0.594 | 0.0 | 0.0 | 0.0024 | 0.1167 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8079 | 13.0 | 3016 | 0.7473 | 0.0848 | 0.1559 | 0.0789 | 0.2794 | 0.1307 | 0.0779 | 0.078 | 0.16 | 0.1727 | 0.5 | 0.2668 | 0.1625 | 0.0 | 0.0 | 0.4983 | 0.6363 | 0.0 | 0.0 | 0.0103 | 0.4 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8079 | 14.0 | 3248 | 0.8514 | 0.0706 | 0.1541 | 0.0485 | 0.3034 | 0.1071 | 0.1046 | 0.079 | 0.1726 | 0.1862 | 0.4222 | 0.2556 | 0.2875 | 0.0 | 0.0 | 0.4149 | 0.5669 | 0.0 | 0.0 | 0.0089 | 0.55 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.8079 | 15.0 | 3480 | 0.7615 | 0.0814 | 0.1579 | 0.0709 | 0.3673 | 0.1229 | 0.0691 | 0.0929 | 0.161 | 0.1719 | 0.4444 | 0.2177 | 0.3125 | 0.0 | 0.0 | 0.4774 | 0.6146 | 0.0 | 0.0 | 0.0109 | 0.4167 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7672 | 16.0 | 3712 | 0.7819 | 0.0786 | 0.1555 | 0.0732 | 0.2782 | 0.1182 | 0.1589 | 0.103 | 0.1757 | 0.1878 | 0.3556 | 0.2629 | 0.325 | 0.0 | 0.0 | 0.4557 | 0.5936 | 0.0 | 0.0 | 0.0159 | 0.5333 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7672 | 17.0 | 3944 | 0.7723 | 0.0807 | 0.1551 | 0.0757 | 0.3302 | 0.1227 | 0.0687 | 0.0812 | 0.1762 | 0.1939 | 0.4 | 0.274 | 0.3 | 0.0 | 0.0 | 0.4724 | 0.6135 | 0.0 | 0.0 | 0.0121 | 0.55 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.744 | 18.0 | 4176 | 0.7838 | 0.0784 | 0.1535 | 0.0709 | 0.3594 | 0.1201 | 0.0564 | 0.0865 | 0.1865 | 0.2497 | 0.4 | 0.3637 | 0.325 | 0.0009 | 0.3 | 0.4503 | 0.5982 | 0.0 | 0.0 | 0.0191 | 0.6 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.744 | 19.0 | 4408 | 0.7714 | 0.0763 | 0.1552 | 0.0625 | 0.2732 | 0.116 | 0.0675 | 0.0605 | 0.1696 | 0.1824 | 0.4111 | 0.2502 | 0.3 | 0.0 | 0.0 | 0.4461 | 0.5943 | 0.0 | 0.0 | 0.0115 | 0.5 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7219 | 20.0 | 4640 | 0.7403 | 0.0808 | 0.1543 | 0.0686 | 0.2968 | 0.1251 | 0.0665 | 0.0984 | 0.1889 | 0.1997 | 0.5111 | 0.2859 | 0.3 | 0.0 | 0.0 | 0.4738 | 0.6146 | 0.0 | 0.0 | 0.0109 | 0.5833 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.7219 | 21.0 | 4872 | 0.7421 | 0.0839 | 0.1599 | 0.0803 | 0.2862 | 0.2166 | 0.0451 | 0.1396 | 0.2437 | 0.257 | 0.5 | 0.3782 | 0.275 | 0.0123 | 0.4 | 0.4792 | 0.6089 | 0.0 | 0.0 | 0.0118 | 0.5333 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6955 | 22.0 | 5104 | 0.7459 | 0.0844 | 0.1619 | 0.0753 | 0.3116 | 0.1475 | 0.0462 | 0.1395 | 0.2253 | 0.2342 | 0.4333 | 0.3404 | 0.2875 | 0.0068 | 0.25 | 0.4815 | 0.6053 | 0.0 | 0.0 | 0.018 | 0.55 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6955 | 23.0 | 5336 | 0.7439 | 0.0844 | 0.1594 | 0.0782 | 0.3246 | 0.1335 | 0.0682 | 0.1559 | 0.2468 | 0.2567 | 0.4667 | 0.378 | 0.3 | 0.0031 | 0.3 | 0.4854 | 0.6071 | 0.0 | 0.0 | 0.018 | 0.6333 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6836 | 24.0 | 5568 | 0.7263 | 0.0879 | 0.1619 | 0.0854 | 0.3509 | 0.1397 | 0.0597 | 0.1718 | 0.2499 | 0.2589 | 0.4333 | 0.3818 | 0.275 | 0.0027 | 0.3 | 0.4998 | 0.6203 | 0.0 | 0.0 | 0.025 | 0.6333 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6836 | 25.0 | 5800 | 0.7146 | 0.0881 | 0.1632 | 0.087 | 0.3606 | 0.1412 | 0.0714 | 0.1518 | 0.2466 | 0.2572 | 0.4667 | 0.3768 | 0.3 | 0.0032 | 0.3 | 0.4988 | 0.6263 | 0.0 | 0.0 | 0.0267 | 0.6167 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6671 | 26.0 | 6032 | 0.7180 | 0.0915 | 0.1678 | 0.0918 | 0.3652 | 0.2262 | 0.0483 | 0.163 | 0.2506 | 0.2672 | 0.5 | 0.3979 | 0.2875 | 0.0186 | 0.35 | 0.5069 | 0.6367 | 0.0 | 0.0 | 0.0235 | 0.6167 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6671 | 27.0 | 6264 | 0.7219 | 0.0907 | 0.1678 | 0.0888 | 0.3596 | 0.2122 | 0.0492 | 0.1375 | 0.26 | 0.2709 | 0.5 | 0.4012 | 0.3 | 0.0184 | 0.35 | 0.5014 | 0.6253 | 0.0 | 0.0 | 0.0243 | 0.65 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.6671 | 28.0 | 6496 | 0.7177 | 0.0882 | 0.1642 | 0.0866 | 0.3625 | 0.1439 | 0.0505 | 0.1463 | 0.2603 | 0.2715 | 0.5111 | 0.4021 | 0.3 | 0.0038 | 0.35 | 0.4996 | 0.6288 | 0.0 | 0.0 | 0.0256 | 0.65 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.656 | 29.0 | 6728 | 0.7257 | 0.0873 | 0.1634 | 0.084 | 0.355 | 0.1417 | 0.0497 | 0.1433 | 0.2572 | 0.2708 | 0.5 | 0.401 | 0.3 | 0.0039 | 0.35 | 0.4962 | 0.6246 | 0.0 | 0.0 | 0.0237 | 0.65 | 0.0 | 0.0 | 0.0 | 0.0 |
| 0.656 | 30.0 | 6960 | 0.7220 | 0.0875 | 0.1634 | 0.084 | 0.355 | 0.1423 | 0.0499 | 0.1462 | 0.2602 | 0.2709 | 0.5 | 0.4013 | 0.3 | 0.0039 | 0.35 | 0.4971 | 0.6256 | 0.0 | 0.0 | 0.0239 | 0.65 | 0.0 | 0.0 | 0.0 | 0.0 |

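The Map/Mar columns above follow the COCO object-detection naming scheme (mAP averaged over IoU 0.50:0.95, at 0.50, at 0.75, by object size, and recall at 1/10/100 detections, plus per-class values). One library that produces metrics with exactly this shape is `torchmetrics`' `MeanAveragePrecision`; the sketch below is illustrative only and does not claim this is how the numbers in the table were computed.

```python
# Illustrative only: computes COCO-style map/mar values with the same names as the
# columns above (requires torchmetrics with its detection backend, i.e. pycocotools).
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

preds = [{
    "boxes": torch.tensor([[10.0, 12.0, 55.0, 40.0]]),  # xyxy pixel coordinates
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([1]),                         # hypothetical class index
}]
targets = [{
    "boxes": torch.tensor([[11.0, 10.0, 54.0, 42.0]]),
    "labels": torch.tensor([1]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
print(results["map_per_class"], results["mar_100_per_class"])  # per-class, as in the table
```
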
### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
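
A quick sanity check that a local environment matches these pins (import names as used below; the expected versions are the ones listed above):

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.41.2
print("PyTorch:", torch.__version__)              # expected 2.3.0+cu121
print("Datasets:", datasets.__version__)          # expected 2.20.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.19.1
```
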
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:3b892c3c1bda3aaca55e52bb0ec01374b723a659c2e9a2b20b452c7ae680dc4d
+ oid sha256:1e45faf0f071fe3c5992867f6904f8273144becdc068ff6928c9093947239c93
  size 166505112
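
Only the git-LFS pointer's sha256 changed; the file size is identical. A hedged sketch for verifying a downloaded copy of the weights against the new pointer: `hf_hub_download` is the standard Hub download helper, the repo id `ArrayDice/Vehicle_Detection_Model` is an assumption about where this commit lives, and the expected digest is the new oid shown above.

```python
# Download model.safetensors and check its sha256 against the LFS pointer above.
import hashlib
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="ArrayDice/Vehicle_Detection_Model",  # assumed repo id
    filename="model.safetensors",
)

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha256.update(chunk)

print(sha256.hexdigest())
# expected: 1e45faf0f071fe3c5992867f6904f8273144becdc068ff6928c9093947239c93
```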