zpn committed

Commit f11eb36 · 1 Parent(s): 32275f7

Update README.md
Files changed (1): README.md (+3 −3)

@@ -39,7 +39,7 @@ This model was trained on `nomic-ai/gpt4all-j-prompt-generations` using `revisio
 Results on common sense reasoning benchmarks
 
 ```
-Model                   BoolQ      PIQA       HellaSwag   WinoGrande   ARC-e      ARC-c      OBQA
+Model                   BoolQ      PIQA       HellaSwag   WinoGrande   ARC-e      ARC-c      OBQA
 ----------------------- ---------- ---------- ----------- ------------ ---------- ---------- ----------
 GPT4All-J 6B v1.0       73.4       74.8       63.4        64.7         54.9       36.0       40.2
 GPT4All-J v1.1-breezy   74.0       75.1       63.2        63.6         55.4       34.9       38.4
@@ -47,14 +47,14 @@ Results on common sense reasoning benchmarks
 GPT4All-J v1.3-groovy   73.6       74.3       63.8        63.5         57.7       35.0       38.8
 GPT4All-J Lora 6B       68.6       75.8       66.2        63.5         56.4       35.7       40.2
 GPT4All LLaMa Lora 7B   73.1       77.6       72.1        67.8         51.1       40.4       40.2
-GPT4All 13B snoozy      *83.3*     79.2       75.0        *71.3*       60.9       *44.2*     43.4
+GPT4All 13B snoozy      *83.3*     79.2       75.0        *71.3*       60.9       44.2       43.4
 Dolly 6B                68.8       77.3       67.6        63.9         62.9       38.7       41.2
 Dolly 12B               56.7       75.4       71.0        62.2         *64.6*     38.5       40.4
 Alpaca 7B               73.9       77.2       73.9        66.1         59.8       43.3       43.4
 Alpaca Lora 7B          74.3       *79.3*     74.0        68.8         56.6       43.9       42.6
 GPT-J 6B                65.4       76.2       66.2        64.1         62.2       36.6       38.2
 LLama 7B                73.1       77.4       73.0        66.9         52.5       41.4       42.4
-LLama 13B               68.5       79.1       *76.2*      70.1         60.0       44.6       42.2
+LLama 13B               68.5       79.1       *76.2*      70.1         60.0       *44.6*     42.2
 Pythia 6.9B             63.5       76.3       64.0        61.1         61.3       35.2       37.2
 Pythia 12B              67.7       76.6       67.3        63.8         63.9       34.8       38.0
 Vicuña T5               81.5       64.6       46.3        61.8         49.3       33.3       39.4