Models | Data Source | Model Size (B) | Overall | Biology | Business | Chemistry | Computer Science | Economics | Engineering | Health | History | Law | Math | Philosophy | Physics | Psychology | Other |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
GPT-4o (2024-05-13) | TIGER-Lab | unk | 0.7255 | 0.8675 | 0.7858 | 0.7393 | 0.7829 | 0.808 | 0.55 | 0.7212 | 0.7007 | 0.5104 | 0.7609 | 0.7014 | 0.7467 | 0.7919 | 0.7748 |
Gemini-1.5-Pro | Self-Reported | unk | 0.6903 | 0.8466 | 0.7288 | 0.7032 | 0.7293 | 0.7844 | 0.4871 | 0.7274 | 0.6562 | 0.5077 | 0.7276 | 0.6172 | 0.7036 | 0.7720 | 0.7251 |
Claude-3-Opus | TIGER-Lab | unk | 0.6845 | 0.8507 | 0.7338 | 0.693 | 0.6902 | 0.798 | 0.484 | 0.6845 | 0.6141 | 0.5349 | 0.6957 | 0.6352 | 0.6966 | 0.7631 | 0.6991 |
Gemini-1.5-Flash | TIGER-Lab | unk | 0.5912 | 0.8131 | 0.667 | 0.613 | 0.5951 | 0.6943 | 0.4416 | 0.6039 | 0.538 | 0.3732 | 0.5958 | 0.4949 | 0.612 | 0.7005 | 0.58 |
Llama-3-70B-Instruct | TIGER-Lab | 70 | 0.562 | 0.7812 | 0.6018 | 0.4681 | 0.6053 | 0.6841 | 0.4362 | 0.6533 | 0.5692 | 0.3991 | 0.5402 | 0.5499 | 0.4962 | 0.7017 | 0.5924 |
Claude-3-Sonnet | TIGER-Lab | unk | 0.568 | 0.768 | 0.657 | 0.5291 | 0.59 | 0.709 | 0.4045 | 0.6332 | 0.5721 | 0.427 | 0.49 | 0.513 | 0.5311 | 0.7218 | 0.594 |
Deepseek-V2-Chat | TIGER-Lab | 236 | 0.5481 | 0.6625 | 0.6375 | 0.5415 | 0.5171 | 0.6363 | 0.3189 | 0.5825 | 0.4528 | 0.4064 | 0.5366 | 0.5492 | 0.5396 | 0.6621 | 0.6299 |
Llama-3-70B | TIGER-Lab | 70 | 0.5278 | 0.749 | 0.4994 | 0.417 | 0.5512 | 0.6528 | 0.3498 | 0.6174 | 0.5774 | 0.3497 | 0.4967 | 0.5691 | 0.4981 | 0.7143 | 0.5942 |
Llama-3-8B-Instruct | TIGER-Lab | 8 | 0.4098 | 0.6653 | 0.4043 | 0.28 | 0.4244 | 0.5355 | 0.3127 | 0.4902 | 0.4226 | 0.2652 | 0.3605 | 0.4048 | 0.3441 | 0.594 | 0.46 |
Llama-3-8B | TIGER-Lab | 8 | 0.3536 | 0.5649 | 0.3207 | 0.2482 | 0.3366 | 0.4668 | 0.2549 | 0.4328 | 0.3622 | 0.1962 | 0.3042 | 0.4048 | 0.3141 | 0.5326 | 0.4145 |
Yi-1.5-34B-Chat | TIGER-Lab | 34 | 0.5229 | 0.7141 | 0.5843 | 0.4753 | 0.539 | 0.6457 | 0.3437 | 0.5819 | 0.5276 | 0.3479 | 0.5618 | 0.4629 | 0.4935 | 0.6429 | 0.5162 |
Yi-1.5-9B-Chat | TIGER-Lab | 9 | 0.4595 | 0.6667 | 0.5425 | 0.3949 | 0.5 | 0.6019 | 0.3323 | 0.4352 | 0.4094 | 0.2661 | 0.5248 | 0.4008 | 0.4142 | 0.594 | 0.4491 |
Yi-1.5-6B-Chat | TIGER-Lab | 6 | 0.3823 | 0.5746 | 0.4766 | 0.3074 | 0.4366 | 0.5273 | 0.2683 | 0.3362 | 0.3176 | 0.2198 | 0.4145 | 0.3327 | 0.3564 | 0.5013 | 0.382 |
Mixtral-8x7B-Instruct-v0.1 | TIGER-Lab | 56 | 0.4327 | 0.6764 | 0.4119 | 0.2756 | 0.4439 | 0.5581 | 0.2921 | 0.5049 | 0.4462 | 0.3206 | 0.3634 | 0.4729 | 0.3988 | 0.6341 | 0.4989 |
Mixtral-8x7B-v0.1 | TIGER-Lab | 56 | 0.4103 | 0.6206 | 0.3752 | 0.288 | 0.4683 | 0.5071 | 0.2786 | 0.467 | 0.4751 | 0.2707 | 0.3412 | 0.4629 | 0.3718 | 0.6103 | 0.4946 |
Qwen1.5-7B-Chat | TIGER-Lab | 7 | 0.2906 | 0.4561 | 0.2953 | 0.1943 | 0.3195 | 0.41 | 0.1878 | 0.2714 | 0.3018 | 0.198 | 0.3064 | 0.2826 | 0.2148 | 0.4524 | 0.3323 |
Qwen1.5-14B-Chat | TIGER-Lab | 14 | 0.3802 | 0.6151 | 0.3942 | 0.2615 | 0.3683 | 0.5142 | 0.2817 | 0.4218 | 0.3753 | 0.2489 | 0.3886 | 0.3527 | 0.3156 | 0.5251 | 0.4069 |
Llama-2-13B | TIGER-Lab | 13 | 0.2534 | 0.4045 | 0.2484 | 0.1519 | 0.2293 | 0.3353 | 0.2023 | 0.3081 | 0.2835 | 0.1599 | 0.1651 | 0.3046 | 0.1986 | 0.4261 | 0.3344 |
Llama-2-7B | TIGER-Lab | 7 | 0.2032 | 0.325 | 0.1876 | 0.1511 | 0.1829 | 0.3164 | 0.1496 | 0.2298 | 0.1942 | 0.1662 | 0.1332 | 0.2204 | 0.1694 | 0.317 | 0.2143 |
c4ai-command-r-v01 | TIGER-Lab | 35 | 0.379 | 0.5509 | 0.3739 | 0.2226 | 0.3829 | 0.5118 | 0.2477 | 0.4878 | 0.4751 | 0.3397 | 0.2628 | 0.4289 | 0.2833 | 0.5852 | 0.4665 |
Yi-6B-Chat | TIGER-Lab | 6 | 0.2884 | 0.477 | 0.2826 | 0.1661 | 0.2659 | 0.3969 | 0.1899 | 0.3521 | 0.315 | 0.2162 | 0.2124 | 0.3367 | 0.2094 | 0.4912 | 0.3506 |
Yi-large | TIGER-Lab | 150 | 0.5809 | 0.6987 | 0.6413 | 0.6166 | 0.6341 | 0.6813 | 0.4541 | 0.6443 | 0.4961 | 0.3624 | 0.6481 | 0.5531 | 0.5704 | 0.5063 | 0.6472 |
GPT-4-Turbo | TIGER-Lab | unk | 0.6371 | 0.8243 | 0.673 | 0.5592 | 0.6854 | 0.7476 | 0.3591 | 0.7078 | 0.6772 | 0.5123 | 0.6277 | 0.6433 | 0.6097 | 0.7832 | 0.7186 |
MAmmoTH2-7B-Plus | TIGER-Lab | 7 | 0.4085 | 0.615 | 0.4588 | 0.3604 | 0.3805 | 0.5722 | 0.2363 | 0.4009 | 0.3674 | 0.2298 | 0.4574 | 0.3346 | 0.396 | 0.5513 | 0.408 |
MAmmoTH2-8B-Plus | TIGER-Lab | 8 | 0.4335 | 0.6429 | 0.4765 | 0.3904 | 0.4317 | 0.5734 | 0.2631 | 0.4132 | 0.4461 | 0.2479 | 0.4766 | 0.4208 | 0.398 | 0.5563 | 0.461 |
MAmmoTH2-8x7B-Plus | TIGER-Lab | 56 | 0.504 | 0.7183 | 0.5615 | 0.4205 | 0.4854 | 0.6398 | 0.3395 | 0.5538 | 0.5092 | 0.3551 | 0.5026 | 0.481 | 0.4565 | 0.6378 | 0.5444 |
Gemma-7B | TIGER-Lab | 7 | 0.3373 | 0.5649 | 0.3333 | 0.2624 | 0.3659 | 0.4242 | 0.227 | 0.3716 | 0.3675 | 0.2171 | 0.2509 | 0.3908 | 0.2756 | 0.5175 | 0.4091 |
Qwen1.5-72B-Chat | TIGER-Lab | 72 | 0.5264 | 0.7280 | 0.5792 | 0.4196 | 0.5683 | 0.6540 | 0.3664 | 0.5954 | 0.5591 | 0.3851 | 0.5233 | 0.5150 | 0.4419 | 0.6767 | 0.5823 |
Qwen1.5-110B | TIGER-Lab | 110 | 0.4993 | 0.7476 | 0.4664 | 0.3746 | 0.5122 | 0.6185 | 0.3529 | 0.5868 | 0.5407 | 0.3506 | 0.5041 | 0.5311 | 0.4142 | 0.6629 | 0.5639 |
Mistral-7B-Instruct-v0.2 | TIGER-Lab | 7 | 0.3084 | 0.4533 | 0.289 | 0.1767 | 0.3195 | 0.4633 | 0.1971 | 0.3875 | 0.3438 | 0.218 | 0.2243 | 0.3387 | 0.2548 | 0.51 | 0.3755 |
Mistral-7B-v0.1 | TIGER-Lab | 7 | 0.3088 | 0.4965 | 0.2852 | 0.1846 | 0.339 | 0.4028 | 0.2239 | 0.3863 | 0.3255 | 0.2071 | 0.2354 | 0.3687 | 0.2479 | 0.4887 | 0.3755 |
Mistral-7B-v0.2 | TIGER-Lab | 7 | 0.3043 | 0.484 | 0.2611 | 0.1829 | 0.3073 | 0.3791 | 0.2415 | 0.3704 | 0.3228 | 0.1989 | 0.2287 | 0.3547 | 0.254 | 0.4937 | 0.3961 |
Mistral-7B-Instruct-v0.1 | TIGER-Lab | 7 | 0.2575 | 0.4993 | 0.1914 | 0.1687 | 0.3049 | 0.3412 | 0.1651 | 0.2885 | 0.2835 | 0.1753 | 0.1821 | 0.2705 | 0.2163 | 0.4336 | 0.303 |
Yi-34B | TIGER-Lab | 34 | 0.4303 | 0.6527 | 0.4005 | 0.2650 | 0.4366 | 0.5569 | 0.3261 | 0.5379 | 0.5197 | 0.3270 | 0.3175 | 0.4770 | 0.3503 | 0.6253 | 0.5509 |
Llama-2-70B | TIGER-Lab | 70 | 0.3753 | 0.5802 | 0.3853 | 0.2217 | 0.4098 | 0.5059 | 0.2353 | 0.4352 | 0.4593 | 0.2861 | 0.2679 | 0.4629 | 0.2818 | 0.5902 | 0.4827 |
Yi-6B | TIGER-Lab | 6 | 0.2651 | 0.4226 | 0.2864 | 0.1484 | 0.2732 | 0.3578 | 0.1796 | 0.3166 | 0.2940 | 0.1953 | 0.1902 | 0.3186 | 0.1832 | 0.4286 | 0.3496 |
Llemma-7B | TIGER-Lab | 7 | 0.2345 | 0.3724 | 0.251 | 0.1829 | 0.2659 | 0.3009 | 0.2384 | 0.2139 | 0.1522 | 0.148 | 0.2161 | 0.1964 | 0.2571 | 0.2957 | 0.2165 |
DeepseekMath-7B-Instruct | TIGER-Lab | 7 | 0.353 | 0.46 | 0.4233 | 0.4108 | 0.3902 | 0.4822 | 0.3364 | 0.2506 | 0.1522 | 0.1571 | 0.4278 | 0.2705 | 0.3918 | 0.3947 | 0.2803 |
OpenChat-3.5-8B | TIGER-Lab | 8 | 0.3724 | 0.5578 | 0.415 | 0.2464 | 0.4048 | 0.4869 | 0.2693 | 0.4132 | 0.3989 | 0.2461 | 0.3619 | 0.3847 | 0.3048 | 0.5451 | 0.3928 |
Gemma-2B | TIGER-Lab | 2 | 0.1585 | 0.2482 | 0.1457 | 0.1378 | 0.1414 | 0.1753 | 0.1269 | 0.177 | 0.154 | 0.123 | 0.163 | 0.1482 | 0.1563 | 0.1608 | 0.1817 |
Zephyr-7B-Beta | TIGER-Lab | 7 | 0.3297 | 0.5509 | 0.2775 | 0.2367 | 0.3756 | 0.4573 | 0.2394 | 0.396 | 0.3202 | 0.2198 | 0.2361 | 0.3896 | 0.3567 | 0.2817 | 0.5050 |
Neo-7B | TIGER-Lab | 7 | 0.2585 | 0.4253 | 0.2547 | 0.1819 | 0.2414 | 0.3578 | 0.2136 | 0.2652 | 0.2519 | 0.1589 | 0.2509 | 0.2694 | 0.2585 | 0.2347 | 0.3521 |
Starling-7B | TIGER-Lab | 7 | 0.379 | 0.5871 | 0.3688 | 0.2676 | 0.4048 | 0.4810 | 0.2703 | 0.4425 | 0.4356 | 0.2470 | 0.3486 | 0.4231 | 0.3847 | 0.3248 | 0.5463 |
InternMath-7B-Plus | TIGER-Lab | 7 | 0.335 | 0.4714 | 0.3751 | 0.3436 | 0.3975 | 0.4135 | 0.2817 | 0.2628 | 0.1916 | 0.1444 | 0.4826 | 0.2435 | 0.2284 | 0.3826 | 0.3634 |
InternMath-20B-Plus | TIGER-Lab | 20 | 0.371 | 0.5216 | 0.4752 | 0.4028 | 0.446 | 0.4312 | 0.3044 | 0.2334 | 0.2047 | 0.1516 | 0.5610 | 0.2683 | 0.2404 | 0.4234 | 0.3834 |
Llama3-Smaug-8B | TIGER-Lab | 8 | 0.3693 | 0.6220 | 0.3738 | 0.2305 | 0.3658 | 0.4917 | 0.1981 | 0.4327 | 0.4199 | 0.2652 | 0.3316 | 0.4502 | 0.3727 | 0.2856 | 0.5739 |
Phi3-medium-128k | TIGER-Lab | 14 | 0.5191 | 0.7336 | 0.5640 | 0.4382 | 0.5171 | 0.6647 | 0.3437 | 0.5856 | 0.5381 | 0.3597 | 0.4989 | 0.4910 | 0.4519 | 0.7093 | 0.5639 |
Phi3-medium-4k | TIGER-Lab | 14 | 0.557 | 0.7587 | 0.6160 | 0.4991 | 0.5415 | 0.7038 | 0.3787 | 0.6357 | 0.5722 | 0.3833 | 0.5218 | 0.5511 | 0.4935 | 0.7343 | 0.6028 |
Phi3-mini-128k | TIGER-Lab | 3.8 | 0.4386 | 0.6695 | 0.4892 | 0.3763 | 0.4146 | 0.5960 | 0.2570 | 0.4804 | 0.4094 | 0.2698 | 0.4145 | 0.4529 | 0.3803 | 0.6491 | 0.4535 |
Phi3-mini-4k | TIGER-Lab | 3.8 | 0.4566 | 0.7015 | 0.5044 | 0.3896 | 0.4463 | 0.6055 | 0.2869 | 0.5024 | 0.4147 | 0.2852 | 0.4182 | 0.4449 | 0.4095 | 0.6516 | 0.4957 |
Neo-7B-Instruct | TIGER-Lab | 7 | 0.2874 | 0.5097 | 0.3229 | 0.2266 | 0.2919 | 0.3577 | 0.1914 | 0.2509 | 0.2821 | 0.1795 | 0.3545 | 0.2603 | 0.2370 | 0.3619 | 0.2803 |
GLM-4-9B-Chat | TIGER-Lab | 9 | 0.4801 | 0.7015 | 0.5070 | 0.4117 | 0.4976 | 0.6232 | 0.3106 | 0.5379 | 0.5223 | 0.3006 | 0.5107 | 0.4429 | 0.4042 | 0.6165 | 0.5173 |
GLM-4-9B | TIGER-Lab | 9 | 0.4792 | 0.7099 | 0.4778 | 0.3719 | 0.4927 | 0.6552 | 0.3323 | 0.5440 | 0.5381 | 0.3551 | 0.4323 | 0.4709 | 0.3865 | 0.6541 | 0.5390 |
Higgs-Llama-3-70B | Self-Reported | 70 | 0.6316 | 0.8354 | 0.6743 | 0.6034 | 0.6902 | 0.7512 | 0.4737 | 0.6687 | 0.6404 | 0.4432 | 0.6321 | 0.5591 | 0.5989 | 0.7619 | 0.6613 |
Qwen2-72B-Chat | TIGER-Lab | 72 | 0.6438 | 0.8107 | 0.6996 | 0.5989 | 0.6488 | 0.7589 | 0.6724 | 0.4603 | 0.6781 | 0.4587 | 0.7098 | 0.5892 | 0.6089 | 0.7669 | 0.6652 |
Qwen2-72B-32k | TIGER-Lab | 72 | 0.5559 | 0.7866 | 0.5615 | 0.4337 | 0.6146 | 0.7097 | 0.3942 | 0.6271 | 0.5801 | 0.3451 | 0.567 | 0.5731 | 0.5081 | 0.7206 | 0.6017 |
Claude-3.5-Sonnet (2024-06-20) | TIGER-Lab | unk | 0.7612 | 0.8856 | 0.8023 | 0.773 | 0.7976 | 0.8246 | 0.6153 | 0.7531 | 0.7585 | 0.6385 | 0.7683 | 0.7475 | 0.7667 | 0.8221 | 0.7846 |
DeepSeek-Coder-V2-Instruct | TIGER-Lab | 236 | 0.6363 | 0.7657 | 0.7326 | 0.6686 | 0.6878 | 0.7464 | 0.5175 | 0.6112 | 0.5184 | 0.3506 | 0.6342 | 0.5621 | 0.6813 | 0.7206 | 0.6537 |
Gemma-2-9B | TIGER-Lab | 9 | 0.451 | 0.6457 | 0.4284 | 0.3746 | 0.4122 | 0.5486 | 0.3075 | 0.5232 | 0.4987 | 0.2843 | 0.4041 | 0.4850 | 0.4296 | 0.6353 | 0.5271 |
Gemma-2-9B-it | TIGER-Lab | 9 | 0.5208 | 0.7587 | 0.5539 | 0.4664 | 0.5073 | 0.6552 | 0.3622 | 0.5844 | 0.5354 | 0.3579 | 0.4944 | 0.4950 | 0.4758 | 0.6617 | 0.5498 |
Qwen2-7B-Instruct | TIGER-Lab | 7 | 0.4724 | 0.6625 | 0.5412 | 0.3772 | 0.4634 | 0.5995 | 0.3540 | 0.4645 | 0.4331 | 0.2934 | 0.5803 | 0.4509 | 0.3972 | 0.6128 | 0.4697 |
Qwen2-7B | TIGER-Lab | 7 | 0.4073 | 0.6011 | 0.4423 | 0.2977 | 0.4317 | 0.5213 | 0.2982 | 0.4108 | 0.3832 | 0.2380 | 0.4752 | 0.4269 | 0.3410 | 0.5464 | 0.4329 |
Qwen2-1.5B-Instruct | TIGER-Lab | 1.5 | 0.2262 | 0.3612 | 0.2104 | 0.1449 | 0.2220 | 0.3128 | 0.1600 | 0.2347 | 0.2126 | 0.1653 | 0.2420 | 0.2365 | 0.1740 | 0.3321 | 0.2511 |
Qwen2-1.5B | TIGER-Lab | 1.5 | 0.2256 | 0.3515 | 0.1952 | 0.1466 | 0.2829 | 0.2974 | 0.1486 | 0.2311 | 0.1969 | 0.1589 | 0.2376 | 0.2485 | 0.1778 | 0.3283 | 0.2749 |
Qwen2-0.5B-Instruct | TIGER-Lab | 0.5 | 0.1593 | 0.2538 | 0.1432 | 0.1254 | 0.1585 | 0.1991 | 0.1414 | 0.1553 | 0.1706 | 0.1580 | 0.1384 | 0.1323 | 0.1386 | 0.2180 | 0.1483 |
Qwen2-0.5B | TIGER-Lab | 0.5 | 0.1497 | 0.1855 | 0.1420 | 0.1069 | 0.1488 | 0.2062 | 0.1135 | 0.1491 | 0.1732 | 0.1599 | 0.1488 | 0.1523 | 0.1255 | 0.1992 | 0.1374 |
DeepSeek-Coder-V2-Lite-Base | TIGER-Lab | 16 | 0.3437 | 0.4114 | 0.3777 | 0.3366 | 0.3780 | 0.4727 | 0.3127 | 0.2848 | 0.2572 | 0.1589 | 0.4086 | 0.2725 | 0.3811 | 0.3997 | 0.3214 |
DeepSeek-Coder-V2-Lite-Instruct | TIGER-Lab | 16 | 0.4157 | 0.5007 | 0.5463 | 0.4293 | 0.4756 | 0.5344 | 0.3437 | 0.2995 | 0.3123 | 0.1880 | 0.5263 | 0.3006 | 0.4473 | 0.4687 | 0.3896 |
Mathstral-7B-v0.1 | TIGER-Lab | 7 | 0.42 | 0.6346 | 0.4208 | 0.3878 | 0.4561 | 0.5178 | 0.3839 | 0.3924 | 0.3596 | 0.2243 | 0.4767 | 0.3828 | 0.3803 | 0.5263 | 0.4091 |
GPT-4o-mini | TIGER-Lab | unk | 0.6309 | 0.802 | 0.706 | 0.6299 | 0.6707 | 0.7334 | 0.3942 | 0.676 | 0.5879 | 0.3724 | 0.7232 | 0.5591 | 0.6366 | 0.7381 | 0.6613 |
magnum-72b-v1 | TIGER-Lab | 72 | 0.6393 | 0.8219 | 0.6339 | 0.5967 | 0.7116 | 0.7497 | 0.4847 | 0.6626 | 0.6706 | 0.4378 | 0.6737 | 0.6017 | 0.6020 | 0.7657 | 0.6692 |
WizardLM-2-8x22B | TIGER-Lab | 176 | 0.3924 | 0.6234 | 0.5488 | 0.3772 | 0.5045 | 0.5908 | 0.1498 | 0.4854 | 0.4831 | 0.2852 | 0.1972 | 0.5374 | 0.2602 | 0.4825 | 0.4245 |
Mistral-Nemo-Instruct-2407 | TIGER-Lab | 12 | 0.4481 | 0.6583 | 0.4715 | 0.3445 | 0.4463 | 0.5806 | 0.3148 | 0.5281 | 0.4829 | 0.3106 | 0.4241 | 0.4529 | 0.3695 | 0.6165 | 0.4881 |
Mistral-Nemo-Base-2407 | TIGER-Lab | 12 | 0.3977 | 0.6011 | 0.3866 | 0.3127 | 0.3805 | 0.4775 | 0.3013 | 0.4866 | 0.4383 | 0.2343 | 0.3257 | 0.4810 | 0.3464 | 0.5890 | 0.4556 |
Llama-3.1-8B | TIGER-Lab | 8 | 0.366 | 0.5635 | 0.3308 | 0.2588 | 0.3732 | 0.4491 | 0.2859 | 0.4450 | 0.4173 | 0.2107 | 0.3323 | 0.4028 | 0.3118 | 0.5313 | 0.4361 |
Llama-3.1-8B-Instruct | TIGER-Lab | 8 | 0.4425 | 0.6304 | 0.4930 | 0.3763 | 0.4829 | 0.5509 | 0.2972 | 0.5073 | 0.4226 | 0.2725 | 0.4382 | 0.4449 | 0.4026 | 0.6003 | 0.4481 |
Llama-3.1-70B | TIGER-Lab | 70 | 0.5247 | 0.7462 | 0.4880 | 0.4496 | 0.5195 | 0.6209 | 0.3777 | 0.6015 | 0.5827 | 0.3224 | 0.5056 | 0.5691 | 0.4896 | 0.7143 | 0.5833 |
Llama-3.1-70B-Instruct | TIGER-Lab | 70 | 0.6284 | 0.8117 | 0.6641 | 0.5910 | 0.6634 | 0.7524 | 0.4582 | 0.6846 | 0.6614 | 0.4696 | 0.6047 | 0.6172 | 0.5912 | 0.7556 | 0.6602 |
Grok-2 | Self-Reported | unk | 0.7546 | 0.8842 | 0.7896 | 0.7703 | 0.7585 | 0.8187 | 0.6078 | 0.7592 | 0.6982 | 0.6167 | 0.7927 | 0.7234 | 0.7729 | 0.8133 | 0.7662 |
Grok-2-mini | Self-Reported | unk | 0.7185 | 0.8465 | 0.7566 | 0.7429 | 0.7317 | 0.8092 | 0.5624 | 0.7518 | 0.6719 | 0.5367 | 0.7609 | 0.6553 | 0.7328 | 0.7994 | 0.7197 |
Phi-3.5-mini-instruct | TIGER-Lab | 3.8 | 0.4787 | 0.7057 | 0.5349 | 0.4125 | 0.5195 | 0.6386 | 0.3075 | 0.5244 | 0.4252 | 0.2943 | 0.4900 | 0.5000 | 0.4509 | 0.4188 | 0.6353 |
Reflection-Llama-3.1-70B | TIGER-Lab | 70 | 0.6035 | 0.7950 | 0.6324 | 0.5433 | 0.6268 | 0.7370 | 0.4396 | 0.6907 | 0.6194 | 0.4242 | 0.6136 | 0.5772 | 0.5327 | 0.7444 | 0.6504 |
GPT-4o (2024-08-06) | TIGER-Lab | unk | 0.7468 | 0.8926 | 0.801 | 0.727 | 0.7829 | 0.8164 | 0.5531 | 0.7604 | 0.7323 | 0.5895 | 0.7942 | 0.7034 | 0.7506 | 0.8271 | 0.7955 |
DeepSeek-Chat-V2_5 | TIGER-Lab | 236 | 0.6583 | 0.8271 | 0.7364 | 0.6979 | 0.7098 | 0.7678 | 0.517 | 0.6247 | 0.5564 | 0.3715 | 0.7535 | 0.5631 | 0.7052 | 0.7268 | 0.6385 |
Qwen2.5-3B | Self-Reported | 3 | 0.4373 | 0.5453 | 0.5412 | 0.4072 | 0.4317 | 0.5296 | 0.2921 | 0.4401 | 0.3911 | 0.2234 | 0.5455 | 0.3707 | 0.4403 | 0.5551 | 0.4145 |
Qwen2.5-72B | Self-Reported | 72 | 0.7159 | 0.8326 | 0.7693 | 0.7314 | 0.7488 | 0.8104 | 0.5645 | 0.6956 | 0.6745 | 0.4914 | 0.8120 | 0.6473 | 0.7498 | 0.7857 | 0.7100 |
Qwen2.5-14B | Self-Reported | 14 | 0.6369 | 0.7978 | 0.7085 | 0.6873 | 0.6707 | 0.7310 | 0.4954 | 0.6222 | 0.5774 | 0.3660 | 0.6788 | 0.5711 | 0.6844 | 0.7243 | 0.6288 |
Qwen2.5-32B | Self-Reported | 32 | 0.6923 | 0.8396 | 0.7567 | 0.7032 | 0.7390 | 0.7725 | 0.5480 | 0.6932 | 0.5932 | 0.4541 | 0.8053 | 0.6152 | 0.7259 | 0.7569 | 0.6645 |
Qwen2.5-1.5B | Self-Reported | 1.5 | 0.321 | 0.4351 | 0.3739 | 0.2562 | 0.3512 | 0.3886 | 0.1899 | 0.3362 | 0.2782 | 0.1480 | 0.4301 | 0.2786 | 0.2856 | 0.4687 | 0.3247 |
Qwen2.5-0.5B | Self-Reported | 0.5 | 0.1492 | 0.2078 | 0.1458 | 0.1157 | 0.1366 | 0.2251 | 0.1104 | 0.1687 | 0.1312 | 0.1335 | 0.1325 | 0.1323 | 0.1224 | 0.2118 | 0.1504 |
RRD2.5-9B | Self-Reported | 9 | 0.6184 | 0.7824 | 0.6820 | 0.6078 | 0.6561 | 0.7204 | 0.4857 | 0.6198 | 0.5381 | 0.3733 | 0.7365 | 0.5471 | 0.6020 | 0.7118 | 0.6082 |
Gemini-1.5-Flash-002 | TIGER-Lab | unk | 0.6409 | 0.8368 | 0.7145 | 0.6708 | 0.6341 | 0.7628 | 0.407 | 0.6284 | 0.5932 | 0.4286 | 0.6255 | 0.6052 | 0.7141 | 0.7623 | 0.6453 |
Jamba-1.5-Large | TIGER-Lab | 399 | 0.4946 | 0.7713 | 0.5792 | 0.2995 | 0.5463 | 0.6635 | 0.3344 | 0.6064 | 0.5564 | 0.4405 | 0.3205 | 0.5731 | 0.338 | 0.7306 | 0.6061 |
Gemini-1.5-Pro-002 | TIGER-Lab | unk | 0.7025 | 0.8645 | 0.8094 | 0.6221 | 0.7122 | 0.8171 | 0.5899 | 0.7479 | 0.7008 | 0.5522 | 0.5174 | 0.7234 | 0.8072 | 0.8294 | 0.7359 |
Llama-3.1-Nemotron-70B-Instruct-HF | TIGER-Lab | 70 | 0.6278 | 0.7992 | 0.6793 | 0.5963 | 0.6829 | 0.7642 | 0.4045 | 0.6797 | 0.6352 | 0.4687 | 0.6306 | 0.6012 | 0.5989 | 0.7268 | 0.6807 |
Ministral-8B-Instruct-2410 | TIGER-Lab | 8 | 0.3793 | 0.5900 | 0.3942 | 0.2641 | 0.4488 | 0.4929 | 0.2312 | 0.4328 | 0.4383 | 0.2598 | 0.4115 | 0.3647 | 0.2918 | 0.5163 | 0.4015 |
Claude-3.5-Sonnet (2024-10-22) | TIGER-Lab | unk | 0.7764 | 0.8856 | 0.8137 | 0.7853 | 0.8244 | 0.859 | 0.613 | 0.7689 | 0.7375 | 0.6458 | 0.8105 | 0.7675 | 0.7729 | 0.8459 | 0.8019 |
Mistral-Small-Instruct-2409 | TIGER-Lab | 22 | 0.484 | 0.7169 | 0.5272 | 0.3684 | 0.5366 | 0.6007 | 0.3055 | 0.5379 | 0.5013 | 0.3197 | 0.5085 | 0.4810 | 0.4034 | 0.6391 | 0.5509 |
Yi-Lightning | TIGER-Lab | unk | 0.6238 | 0.7964 | 0.6907 | 0.6193 | 0.6439 | 0.731 | 0.4221 | 0.6553 | 0.5748 | 0.3751 | 0.6913 | 0.5711 | 0.6251 | 0.7293 | 0.6677 |
SmolLM-135M | TIGER-Lab | 0.135 | 0.1122 | 0.1227 | 0.1191 | 0.1051 | 0.1073 | 0.1173 | 0.0939 | 0.1247 | 0.0945 | 0.1153 | 0.1140 | 0.1563 | 0.0970 | 0.1090 | 0.1136 |
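For programmatic use, the table can be loaded with the Hugging Face `datasets` library and sorted or filtered like any tabular dataset. Below is a minimal sketch; the repository id `TIGER-Lab/MMLU-Pro-leaderboard` is an assumption (substitute the actual dataset path for this page), and per the schema only `Overall` is stored as a float while the per-subject columns are strings, some of which (e.g. Other) may be null.

```python
# A minimal sketch, assuming this leaderboard is published as a
# Hugging Face dataset. The repository id is hypothetical -- replace
# it with the actual dataset path shown on this page.
from datasets import load_dataset

ds = load_dataset("TIGER-Lab/MMLU-Pro-leaderboard", split="train")  # hypothetical repo id

# "Overall" is numeric (float64); subject columns are strings, so cast
# them (and guard against nulls) before any numeric comparison.
top = sorted(ds, key=lambda row: row["Overall"], reverse=True)[:5]
for row in top:
    print(f"{row['Models']:<40} {row['Overall']:.4f}")
```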