fblgit leaderboard-pr-bot committed on
Commit 7881e56
1 Parent(s): 4b62fcc

Adding Evaluation Results (#4)


- Adding Evaluation Results (3458bf67e3915f7c07b7025b5e04dd9ff29e0763)


Co-authored-by: Open LLM Leaderboard PR Bot <[email protected]>

Files changed (1)
  1. README.md +106 -2
README.md CHANGED
@@ -111,8 +111,98 @@ model-index:
     source:
       url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
       name: Open LLM Leaderboard
-
-
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: IFEval (0-Shot)
+      type: HuggingFaceH4/ifeval
+      args:
+        num_few_shot: 0
+    metrics:
+    - type: inst_level_strict_acc and prompt_level_strict_acc
+      value: 45.56
+      name: strict accuracy
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: BBH (3-Shot)
+      type: BBH
+      args:
+        num_few_shot: 3
+    metrics:
+    - type: acc_norm
+      value: 32.78
+      name: normalized accuracy
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: MATH Lvl 5 (4-Shot)
+      type: hendrycks/competition_math
+      args:
+        num_few_shot: 4
+    metrics:
+    - type: exact_match
+      value: 0.15
+      name: exact match
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: GPQA (0-shot)
+      type: Idavidrein/gpqa
+      args:
+        num_few_shot: 0
+    metrics:
+    - type: acc_norm
+      value: 8.95
+      name: acc_norm
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: MuSR (0-shot)
+      type: TAUR-Lab/MuSR
+      args:
+        num_few_shot: 0
+    metrics:
+    - type: acc_norm
+      value: 11.96
+      name: acc_norm
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
+  - task:
+      type: text-generation
+      name: Text Generation
+    dataset:
+      name: MMLU-PRO (5-shot)
+      type: TIGER-Lab/MMLU-Pro
+      config: main
+      split: test
+      args:
+        num_few_shot: 5
+    metrics:
+    - type: acc
+      value: 39.33
+      name: accuracy
+    source:
+      url: https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard?query=fblgit/UNA-SimpleSmaug-34b-v1beta
+      name: Open LLM Leaderboard
 ---
 
 # UNA-SimpleSmaug-34b-v1beta
@@ -177,3 +267,17 @@ To abacusai for making Smaug-34B, the Bagel, and all the magic behind the base m
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-SimpleSmaug-34b-v1beta)
 
 
+
+# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/open-llm-leaderboard/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_fblgit__UNA-SimpleSmaug-34b-v1beta)
+
+|Metric             |Value|
+|-------------------|----:|
+|Avg.               |23.12|
+|IFEval (0-Shot)    |45.56|
+|BBH (3-Shot)       |32.78|
+|MATH Lvl 5 (4-Shot)| 0.15|
+|GPQA (0-shot)      | 8.95|
+|MuSR (0-shot)      |11.96|
+|MMLU-PRO (5-shot)  |39.33|
+
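For a quick sanity check of the table this commit adds, the sketch below (not part of the commit) recomputes the `Avg.` row from the six per-benchmark scores; treating the average as a plain mean of the displayed values, rounded to two decimals, is an assumption about how the leaderboard table was generated.

```python
# Sanity-check sketch (not part of this commit): recompute the "Avg." row of the
# results table from the six per-benchmark scores added to README.md above.
# Assumption: the average shown is the plain mean of the six displayed values,
# rounded to two decimals.
scores = {
    "IFEval (0-Shot)": 45.56,
    "BBH (3-Shot)": 32.78,
    "MATH Lvl 5 (4-Shot)": 0.15,
    "GPQA (0-shot)": 8.95,
    "MuSR (0-shot)": 11.96,
    "MMLU-PRO (5-shot)": 39.33,
}

avg = round(sum(scores.values()) / len(scores), 2)
print(avg)           # 23.12
assert avg == 23.12  # matches the "Avg." value reported in the table
```

The same six values can also be read back programmatically from the `model-index` front matter this diff adds (for example via `huggingface_hub.ModelCard`), rather than copying them from the table.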