tomaarsen (HF staff) committed
Commit 60d0fcf
1 parent: 5d61106

Add new SentenceTransformer model

1_Pooling/config.json ADDED
@@ -0,0 +1,10 @@
+ {
+     "word_embedding_dimension": 768,
+     "pooling_mode_cls_token": false,
+     "pooling_mode_mean_tokens": true,
+     "pooling_mode_max_tokens": false,
+     "pooling_mode_mean_sqrt_len_tokens": false,
+     "pooling_mode_weightedmean_tokens": false,
+     "pooling_mode_lasttoken": false,
+     "include_prompt": true
+ }
README.md ADDED
@@ -0,0 +1,1310 @@
+ ---
+ language:
+ - en
+ license: apache-2.0
+ tags:
+ - sentence-transformers
+ - sentence-similarity
+ - feature-extraction
+ - generated_from_trainer
+ - dataset_size:3012496
+ - loss:MatryoshkaLoss
+ - loss:CachedMultipleNegativesRankingLoss
+ base_model: microsoft/mpnet-base
+ widget:
+ - source_sentence: how to sign legal documents as power of attorney?
+   sentences:
+   - 'After the principal''s name, write “by” and then sign your own name. Under or after the signature line, indicate your status as POA by including any of the following identifiers: as POA, as Agent, as Attorney in Fact or as Power of Attorney.'
+   - '[''From the Home screen, swipe left to Apps.'', ''Tap Transfer my Data.'', ''Tap Menu (...).'', ''Tap Export to SD card.'']'
+   - Ginger Dank Nugs (Grape) - 350mg. Feast your eyes on these unique and striking gourmet chocolates; Coco Nugs created by Ginger Dank. Crafted to resemble perfect nugs of cannabis, each of the 10 buds contains 35mg of THC. ... This is a perfect product for both cannabis and chocolate lovers, who appreciate a little twist.
+ - source_sentence: how to delete vdom in fortigate?
+   sentences:
+   - Go to System -> VDOM -> VDOM2 and select 'Delete'. This VDOM is now successfully removed from the configuration.
+   - 'Both combination birth control pills and progestin-only pills may cause headaches as a side effect. Additional side effects of birth control pills may include: breast tenderness. nausea.'
+   - White cheese tends to show imperfections more readily and as consumers got more used to yellow-orange cheese, it became an expected option. Today, many cheddars are yellow. While most cheesemakers use annatto, some use an artificial coloring agent instead, according to Sachs.
+ - source_sentence: where are earthquakes most likely to occur on earth?
+   sentences:
+   - Zelle in the Bank of the America app is a fast, safe, and easy way to send and receive money with family and friends who have a bank account in the U.S., all with no fees. Money moves in minutes directly between accounts that are already enrolled with Zelle.
+   - It takes about 3 days for a spacecraft to reach the Moon. During that time a spacecraft travels at least 240,000 miles (386,400 kilometers) which is the distance between Earth and the Moon.
+   - Most earthquakes occur along the edge of the oceanic and continental plates. The earth's crust (the outer layer of the planet) is made up of several pieces, called plates. The plates under the oceans are called oceanic plates and the rest are continental plates.
+ - source_sentence: fix iphone is disabled connect to itunes without itunes?
+   sentences:
+   - To fix a disabled iPhone or iPad without iTunes, you have to erase your device. Click on the "Erase iPhone" option and confirm your selection. Wait for a while as the "Find My iPhone" feature will remotely erase your iOS device. Needless to say, it will also disable its lock.
+   - How Māui brought fire to the world. One evening, after eating a hearty meal, Māui lay beside his fire staring into the flames. ... In the middle of the night, while everyone was sleeping, Māui went from village to village and extinguished all the fires until not a single fire burned in the world.
+   - Angry Orchard makes a variety of year-round craft cider styles, including Angry Orchard Crisp Apple, a fruit-forward hard cider that balances the sweetness of culinary apples with dryness and bright acidity of bittersweet apples for a complex, refreshing taste.
+ - source_sentence: how to reverse a video on tiktok that's not yours?
+   sentences:
+   - '[''Tap "Effects" at the bottom of your screen — it\''s an icon that looks like a clock. Open the Effects menu. ... '', ''At the end of the new list that appears, tap "Time." Select "Time" at the end. ... '', ''Select "Reverse" — you\''ll then see a preview of your new, reversed video appear on the screen.'']'
+   - Franchise Facts Poke Bar has a franchise fee of up to $30,000, with a total initial investment range of $157,800 to $438,000. The initial cost of a franchise includes several fees -- Unlock this franchise to better understand the costs such as training and territory fees.
+   - Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.
+ datasets:
+ - sentence-transformers/gooaq
+ pipeline_tag: sentence-similarity
+ library_name: sentence-transformers
+ metrics:
+ - cosine_accuracy@1
+ - cosine_accuracy@3
+ - cosine_accuracy@5
+ - cosine_accuracy@10
+ - cosine_precision@1
+ - cosine_precision@3
+ - cosine_precision@5
+ - cosine_precision@10
+ - cosine_recall@1
+ - cosine_recall@3
+ - cosine_recall@5
+ - cosine_recall@10
+ - cosine_ndcg@10
+ - cosine_mrr@10
+ - cosine_map@100
+ co2_eq_emissions:
+   emissions: 901.0176370050929
+   energy_consumed: 2.3180164676412596
+   source: codecarbon
+   training_type: fine-tuning
+   on_cloud: false
+   cpu_model: 13th Gen Intel(R) Core(TM) i7-13700K
+   ram_total_size: 31.777088165283203
+   hours_used: 5.999
+   hardware_used: 1 x NVIDIA GeForce RTX 3090
+ model-index:
+ - name: MPNet base trained on GooAQ triplets
+   results:
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoClimateFEVER
+       type: NanoClimateFEVER
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.26
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.46
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.5
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.62
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.26
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.1733333333333333
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.11600000000000002
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08199999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.12833333333333333
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.23566666666666664
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.2523333333333333
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.3423333333333333
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.2832168283343785
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.3685714285714285
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.22816684702715823
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoDBPedia
+       type: NanoDBPedia
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.56
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.78
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.82
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.88
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.56
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.5
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.436
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.37800000000000006
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.05411706752798353
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.12035295895525228
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.15928246254162917
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.23697530489351543
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.4605652479922868
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.6701666666666667
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.313461519912651
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoFEVER
+       type: NanoFEVER
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.62
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.82
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.84
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.62
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.27999999999999997
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.172
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.092
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.5766666666666667
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.7866666666666667
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.8066666666666668
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.8666666666666667
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.7421816204572005
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.7256349206349206
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.6984857882513162
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoFiQA2018
+       type: NanoFiQA2018
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.4
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.52
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.6
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.68
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.4
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.26
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.188
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.11199999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.24385714285714286
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.37612698412698414
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.429515873015873
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.5025952380952381
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.43956943866243664
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.48483333333333334
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.39610909278538586
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoHotpotQA
+       type: NanoHotpotQA
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.6
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.72
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.78
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.84
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.6
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.31999999999999995
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.204
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.11799999999999997
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.3
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.48
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.51
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.59
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.5463522282651155
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.6749126984126984
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.4777656892588857
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoMSMARCO
+       type: NanoMSMARCO
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.26
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.54
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.7
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.82
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.26
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.18
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.14
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08199999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.26
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.54
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.7
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.82
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.5254388867327386
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.43241269841269836
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.44192370495002076
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoNFCorpus
+       type: NanoNFCorpus
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.42
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.52
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.54
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.64
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.42
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.3533333333333333
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.29600000000000004
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.22999999999999995
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.024846889440892198
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.050109275117862714
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.06353201637623539
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.08853093525637233
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.2784279013606366
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.48200000000000004
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.1099281411687893
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoNQ
+       type: NanoNQ
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.46
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.64
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.68
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.8
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.46
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.22666666666666666
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.14400000000000002
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08399999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.44
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.63
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.67
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.76
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.6103091812374759
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.5662380952380953
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.5687228298733515
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoQuoraRetrieval
+       type: NanoQuoraRetrieval
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.92
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.98
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.98
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 1.0
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.92
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.40666666666666657
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.25599999999999995
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.13399999999999998
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.7973333333333332
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.9453333333333334
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.9593333333333334
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.9893333333333334
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.9468303023215506
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.948888888888889
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.9245031746031745
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoSCIDOCS
+       type: NanoSCIDOCS
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.34
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.54
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.64
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.76
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.34
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.26
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.21600000000000003
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.148
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.07
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.16
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.22266666666666668
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.3046666666666667
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.29180682575954126
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.4679126984126984
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.20981154821773768
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoArguAna
+       type: NanoArguAna
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.24
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.5
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.68
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.82
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.24
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.16666666666666663
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.136
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08199999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.24
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.5
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.68
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.82
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.5108280876289467
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.413579365079365
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.42352200577200577
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoSciFact
+       type: NanoSciFact
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.52
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.64
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.72
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.74
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.52
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.22666666666666668
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.16
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.08399999999999999
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.485
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.61
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.705
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.73
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.6181538011380482
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.5913333333333333
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.5833669046006453
+       name: Cosine Map@100
+   - task:
+       type: information-retrieval
+       name: Information Retrieval
+     dataset:
+       name: NanoTouche2020
+       type: NanoTouche2020
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.5102040816326531
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.8163265306122449
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.8571428571428571
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.9795918367346939
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.5102040816326531
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.510204081632653
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.47346938775510194
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.41020408163265304
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.03893285013079613
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.11588553532033441
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.17562928121209787
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.2858043118244373
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.4588632608031716
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.6822238419177193
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.36126308261178003
+       name: Cosine Map@100
+   - task:
+       type: nano-beir
+       name: Nano BEIR
+     dataset:
+       name: NanoBEIR mean
+       type: NanoBEIR_mean
+     metrics:
+     - type: cosine_accuracy@1
+       value: 0.47001569858712716
+       name: Cosine Accuracy@1
+     - type: cosine_accuracy@3
+       value: 0.6520251177394034
+       name: Cosine Accuracy@3
+     - type: cosine_accuracy@5
+       value: 0.7182417582417582
+       name: Cosine Accuracy@5
+     - type: cosine_accuracy@10
+       value: 0.8061224489795917
+       name: Cosine Accuracy@10
+     - type: cosine_precision@1
+       value: 0.47001569858712716
+       name: Cosine Precision@1
+     - type: cosine_precision@3
+       value: 0.2971951857666143
+       name: Cosine Precision@3
+     - type: cosine_precision@5
+       value: 0.2259591836734694
+       name: Cosine Precision@5
+     - type: cosine_precision@10
+       value: 0.15663108320251176
+       name: Cosine Precision@10
+     - type: cosine_recall@1
+       value: 0.2814682525607806
+       name: Cosine Recall@1
+     - type: cosine_recall@3
+       value: 0.4269339553990077
+       name: Cosine Recall@3
+     - type: cosine_recall@5
+       value: 0.48722766408814117
+       name: Cosine Recall@5
+     - type: cosine_recall@10
+       value: 0.5643773684668895
+       name: Cosine Recall@10
+     - type: cosine_ndcg@10
+       value: 0.5163495085148867
+       name: Cosine Ndcg@10
+     - type: cosine_mrr@10
+       value: 0.5775929206847574
+       name: Cosine Mrr@10
+     - type: cosine_map@100
+       value: 0.4413100253102233
+       name: Cosine Map@100
+ ---
+
+ # MPNet base trained on GooAQ triplets
+
+ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) on the [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
+
+ ## Model Details
+
+ ### Model Description
+ - **Model Type:** Sentence Transformer
+ - **Base model:** [microsoft/mpnet-base](https://huggingface.co/microsoft/mpnet-base) <!-- at revision 6996ce1e91bd2a9c7d7f61daec37463394f73f09 -->
+ - **Maximum Sequence Length:** 512 tokens
+ - **Output Dimensionality:** 768 dimensions
+ - **Similarity Function:** Cosine Similarity
+ - **Training Dataset:**
+     - [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq)
+ - **Language:** en
+ - **License:** apache-2.0
+
+ ### Model Sources
+
+ - **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+ - **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+ - **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
+
+ ### Full Model Architecture
+
+ ```
+ SentenceTransformer(
+   (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: MPNetModel
+   (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
+ )
+ ```
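The Pooling module averages the token embeddings produced by the Transformer (`pooling_mode_mean_tokens: True`), excluding padding positions via the attention mask. As a rough illustration only (a NumPy sketch, not the library's implementation), masked mean pooling looks like this:

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Mask-aware mean over the sequence axis: (batch, seq, dim) -> (batch, dim)."""
    mask = attention_mask[:, :, None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # guard against all-padding rows
    return summed / counts

# Two sequences of length 4; the first has two padding positions.
tokens = np.arange(2 * 4 * 3, dtype=np.float64).reshape(2, 4, 3)
mask = np.array([[1, 1, 0, 0], [1, 1, 1, 1]])
pooled = mean_pool(tokens, mask)
print(pooled.shape)  # (2, 3)
```

In the real model the token embeddings come from MPNet and `dim` is 768; the padded positions contribute nothing to the sentence embedding.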
872
+
873
+ ## Usage
874
+
875
+ ### Direct Usage (Sentence Transformers)
876
+
877
+ First install the Sentence Transformers library:
878
+
879
+ ```bash
880
+ pip install -U sentence-transformers
881
+ ```
882
+
883
+ Then you can load this model and run inference.
884
+ ```python
885
+ from sentence_transformers import SentenceTransformer
886
+
887
+ # Download from the 🤗 Hub
888
+ model = SentenceTransformer("tomaarsen/mpnet-base-gooaq-cmnrl-mrl")
889
+ # Run inference
890
+ sentences = [
891
+ "how to reverse a video on tiktok that's not yours?",
892
+ '[\'Tap "Effects" at the bottom of your screen — it\\\'s an icon that looks like a clock. Open the Effects menu. ... \', \'At the end of the new list that appears, tap "Time." Select "Time" at the end. ... \', \'Select "Reverse" — you\\\'ll then see a preview of your new, reversed video appear on the screen.\']',
893
+ 'Relative age is the age of a rock layer (or the fossils it contains) compared to other layers. It can be determined by looking at the position of rock layers. Absolute age is the numeric age of a layer of rocks or fossils. Absolute age can be determined by using radiometric dating.',
894
+ ]
895
+ embeddings = model.encode(sentences)
896
+ print(embeddings.shape)
897
+ # (3, 768)
898
+
899
+ # Get the similarity scores for the embeddings
900
+ similarities = model.similarity(embeddings, embeddings)
901
+ print(similarities.shape)
902
+ # torch.Size([3, 3])
903
+ ```
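Because this model was trained with MatryoshkaLoss (see Training Details), its embeddings can be truncated to a smaller dimensionality with only a modest quality drop. Below is a minimal NumPy sketch of that truncation; the random embeddings are stand-ins for the `model.encode` output above, and in practice you can instead pass `truncate_dim=256` to the `SentenceTransformer` constructor.

```python
import numpy as np

# Stand-in embeddings with the model's (3, 768) output shape; in practice
# these would come from model.encode(sentences) above.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(3, 768)).astype(np.float32)

def truncate_and_normalize(emb: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` Matryoshka dimensions, then L2-normalize
    so that plain dot products are cosine similarities."""
    truncated = emb[:, :dim]
    return truncated / np.linalg.norm(truncated, axis=1, keepdims=True)

small = truncate_and_normalize(embeddings, 256)
similarities = small @ small.T
print(small.shape)         # (3, 256)
print(similarities.shape)  # (3, 3)
```

Smaller dimensions (e.g. 64 or 32) trade retrieval quality for index size and search speed.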
904
+
905
+ <!--
906
+ ### Direct Usage (Transformers)
907
+
908
+ <details><summary>Click to see the direct usage in Transformers</summary>
909
+
910
+ </details>
911
+ -->
912
+
913
+ <!--
914
+ ### Downstream Usage (Sentence Transformers)
915
+
916
+ You can finetune this model on your own dataset.
917
+
918
+ <details><summary>Click to expand</summary>
919
+
920
+ </details>
921
+ -->
922
+
923
+ <!--
924
+ ### Out-of-Scope Use
925
+
926
+ *List how the model may foreseeably be misused and address what users ought not to do with the model.*
927
+ -->
928
+
929
+ ## Evaluation
930
+
931
+ ### Metrics
932
+
933
+ #### Information Retrieval
934
+
935
+ * Datasets: `NanoClimateFEVER`, `NanoDBPedia`, `NanoFEVER`, `NanoFiQA2018`, `NanoHotpotQA`, `NanoMSMARCO`, `NanoNFCorpus`, `NanoNQ`, `NanoQuoraRetrieval`, `NanoSCIDOCS`, `NanoArguAna`, `NanoSciFact` and `NanoTouche2020`
936
+ * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
937
+
938
+ | Metric | NanoClimateFEVER | NanoDBPedia | NanoFEVER | NanoFiQA2018 | NanoHotpotQA | NanoMSMARCO | NanoNFCorpus | NanoNQ | NanoQuoraRetrieval | NanoSCIDOCS | NanoArguAna | NanoSciFact | NanoTouche2020 |
939
+ |:--------------------|:-----------------|:------------|:-----------|:-------------|:-------------|:------------|:-------------|:-----------|:-------------------|:------------|:------------|:------------|:---------------|
940
+ | cosine_accuracy@1 | 0.26 | 0.56 | 0.62 | 0.4 | 0.6 | 0.26 | 0.42 | 0.46 | 0.92 | 0.34 | 0.24 | 0.52 | 0.5102 |
941
+ | cosine_accuracy@3 | 0.46 | 0.78 | 0.82 | 0.52 | 0.72 | 0.54 | 0.52 | 0.64 | 0.98 | 0.54 | 0.5 | 0.64 | 0.8163 |
942
+ | cosine_accuracy@5 | 0.5 | 0.82 | 0.84 | 0.6 | 0.78 | 0.7 | 0.54 | 0.68 | 0.98 | 0.64 | 0.68 | 0.72 | 0.8571 |
943
+ | cosine_accuracy@10 | 0.62 | 0.88 | 0.9 | 0.68 | 0.84 | 0.82 | 0.64 | 0.8 | 1.0 | 0.76 | 0.82 | 0.74 | 0.9796 |
944
+ | cosine_precision@1 | 0.26 | 0.56 | 0.62 | 0.4 | 0.6 | 0.26 | 0.42 | 0.46 | 0.92 | 0.34 | 0.24 | 0.52 | 0.5102 |
945
+ | cosine_precision@3 | 0.1733 | 0.5 | 0.28 | 0.26 | 0.32 | 0.18 | 0.3533 | 0.2267 | 0.4067 | 0.26 | 0.1667 | 0.2267 | 0.5102 |
946
+ | cosine_precision@5 | 0.116 | 0.436 | 0.172 | 0.188 | 0.204 | 0.14 | 0.296 | 0.144 | 0.256 | 0.216 | 0.136 | 0.16 | 0.4735 |
947
+ | cosine_precision@10 | 0.082 | 0.378 | 0.092 | 0.112 | 0.118 | 0.082 | 0.23 | 0.084 | 0.134 | 0.148 | 0.082 | 0.084 | 0.4102 |
948
+ | cosine_recall@1 | 0.1283 | 0.0541 | 0.5767 | 0.2439 | 0.3 | 0.26 | 0.0248 | 0.44 | 0.7973 | 0.07 | 0.24 | 0.485 | 0.0389 |
949
+ | cosine_recall@3 | 0.2357 | 0.1204 | 0.7867 | 0.3761 | 0.48 | 0.54 | 0.0501 | 0.63 | 0.9453 | 0.16 | 0.5 | 0.61 | 0.1159 |
950
+ | cosine_recall@5 | 0.2523 | 0.1593 | 0.8067 | 0.4295 | 0.51 | 0.7 | 0.0635 | 0.67 | 0.9593 | 0.2227 | 0.68 | 0.705 | 0.1756 |
951
+ | cosine_recall@10 | 0.3423 | 0.237 | 0.8667 | 0.5026 | 0.59 | 0.82 | 0.0885 | 0.76 | 0.9893 | 0.3047 | 0.82 | 0.73 | 0.2858 |
952
+ | **cosine_ndcg@10** | **0.2832** | **0.4606** | **0.7422** | **0.4396** | **0.5464** | **0.5254** | **0.2784** | **0.6103** | **0.9468** | **0.2918** | **0.5108** | **0.6182** | **0.4589** |
953
+ | cosine_mrr@10 | 0.3686 | 0.6702 | 0.7256 | 0.4848 | 0.6749 | 0.4324 | 0.482 | 0.5662 | 0.9489 | 0.4679 | 0.4136 | 0.5913 | 0.6822 |
954
+ | cosine_map@100 | 0.2282 | 0.3135 | 0.6985 | 0.3961 | 0.4778 | 0.4419 | 0.1099 | 0.5687 | 0.9245 | 0.2098 | 0.4235 | 0.5834 | 0.3613 |
955
+
956
+ #### Nano BEIR
957
+
958
+ * Dataset: `NanoBEIR_mean`
959
+ * Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator)
960
+
961
+ | Metric | Value |
962
+ |:--------------------|:-----------|
963
+ | cosine_accuracy@1 | 0.47 |
964
+ | cosine_accuracy@3 | 0.652 |
965
+ | cosine_accuracy@5 | 0.7182 |
966
+ | cosine_accuracy@10 | 0.8061 |
967
+ | cosine_precision@1 | 0.47 |
968
+ | cosine_precision@3 | 0.2972 |
969
+ | cosine_precision@5 | 0.226 |
970
+ | cosine_precision@10 | 0.1566 |
971
+ | cosine_recall@1 | 0.2815 |
972
+ | cosine_recall@3 | 0.4269 |
973
+ | cosine_recall@5 | 0.4872 |
974
+ | cosine_recall@10 | 0.5644 |
975
+ | **cosine_ndcg@10** | **0.5163** |
976
+ | cosine_mrr@10 | 0.5776 |
977
+ | cosine_map@100 | 0.4413 |
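For reference, the bolded NDCG@10 metric can be computed from a ranked list of graded relevances as sketched below. This is an illustrative implementation, not the exact code used by the evaluators above.

```python
import math

def ndcg_at_k(relevances, k=10):
    """NDCG@k for one query: `relevances` holds the graded relevance of the
    retrieved documents in ranked order (1 = relevant, 0 = not relevant)."""
    def dcg(rels):
        # Discounted cumulative gain: later ranks are down-weighted logarithmically.
        return sum(rel / math.log2(rank + 2) for rank, rel in enumerate(rels[:k]))
    ideal = dcg(sorted(relevances, reverse=True))
    return dcg(relevances) / ideal if ideal > 0 else 0.0

# Relevant documents retrieved at ranks 1 and 3:
print(round(ndcg_at_k([1, 0, 1, 0]), 4))  # 0.9197
```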
978
+
979
+ <!--
980
+ ## Bias, Risks and Limitations
981
+
982
+ *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
983
+ -->
984
+
985
+ <!--
986
+ ### Recommendations
987
+
988
+ *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
989
+ -->
990
+
991
+ ## Training Details
992
+
993
+ ### Training Dataset
994
+
995
+ #### gooaq
996
+
997
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
998
+ * Size: 3,012,496 training samples
999
+ * Columns: <code>question</code> and <code>answer</code>
1000
+ * Approximate statistics based on the first 1000 samples:
1001
+ | | question | answer |
1002
+ |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
1003
+ | type | string | string |
1004
+ | details | <ul><li>min: 8 tokens</li><li>mean: 11.86 tokens</li><li>max: 21 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 60.48 tokens</li><li>max: 138 tokens</li></ul> |
1005
+ * Samples:
1006
+ | question | answer |
1007
+ |:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
1008
+ | <code>what is the difference between broilers and layers?</code> | <code>An egg laying poultry is called egger or layer whereas broilers are reared for obtaining meat. So a layer should be able to produce more number of large sized eggs, without growing too much. On the other hand, a broiler should yield more meat and hence should be able to grow well.</code> |
1009
+ | <code>what is the difference between chronological order and spatial order?</code> | <code>As a writer, you should always remember that unlike chronological order and the other organizational methods for data, spatial order does not take into account the time. Spatial order is primarily focused on the location. All it does is take into account the location of objects and not the time.</code> |
1010
+ | <code>is kamagra same as viagra?</code> | <code>Kamagra is thought to contain the same active ingredient as Viagra, sildenafil citrate. In theory, it should work in much the same way as Viagra, taking about 45 minutes to take effect, and lasting for around 4-6 hours. However, this will vary from person to person.</code> |
1011
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
1012
+ ```json
1013
+ {
1014
+ "loss": "CachedMultipleNegativesRankingLoss",
1015
+ "matryoshka_dims": [
1016
+ 768,
1017
+ 512,
1018
+ 256,
1019
+ 128,
1020
+ 64,
1021
+ 32
1022
+ ],
1023
+ "matryoshka_weights": [
1024
+ 1,
1025
+ 1,
1026
+ 1,
1027
+ 1,
1028
+ 1,
1029
+ 1
1030
+ ],
1031
+ "n_dims_per_step": -1
1032
+ }
1033
+ ```
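The configuration above can be read as: compute the (cached) in-batch negatives ranking loss once per truncation dimension, then sum the six results with unit weights. A simplified NumPy sketch of that objective, without the gradient caching that `CachedMultipleNegativesRankingLoss` adds, looks like this:

```python
import numpy as np

def mnrl(anchors, positives, scale=20.0):
    """Simplified MultipleNegativesRankingLoss: cross-entropy over scaled
    cosine similarities, where each anchor's positive sits on the diagonal
    and the other in-batch positives act as negatives."""
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    scores = scale * (a @ p.T)
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

def matryoshka_mnrl(anchors, positives, dims=(768, 512, 256, 128, 64, 32)):
    """Sum the base loss over every truncation dimension (all weights are 1)."""
    return sum(mnrl(anchors[:, :d], positives[:, :d]) for d in dims)

# Toy batch: "answers" are noisy copies of their "questions".
rng = np.random.default_rng(0)
questions = rng.normal(size=(8, 768))
answers = questions + 0.1 * rng.normal(size=(8, 768))
print(matryoshka_mnrl(questions, answers) > 0)  # True
```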
1034
+
1035
+ ### Evaluation Dataset
1036
+
1037
+ #### gooaq
1038
+
1039
+ * Dataset: [gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c)
1040
+ * Size: 3,012,496 evaluation samples
1041
+ * Columns: <code>question</code> and <code>answer</code>
1042
+ * Approximate statistics based on the first 1000 samples:
1043
+ | | question | answer |
1044
+ |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------|
1045
+ | type | string | string |
1046
+ | details | <ul><li>min: 8 tokens</li><li>mean: 11.88 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 14 tokens</li><li>mean: 61.03 tokens</li><li>max: 127 tokens</li></ul> |
1047
+ * Samples:
1048
+ | question | answer |
1049
+ |:-----------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
1050
+ | <code>how do i program my directv remote with my tv?</code> | <code>['Press MENU on your remote.', 'Select Settings & Help > Settings > Remote Control > Program Remote.', 'Choose the device (TV, audio, DVD) you wish to program. ... ', 'Follow the on-screen prompts to complete programming.']</code> |
1051
+ | <code>are rodrigues fruit bats nocturnal?</code> | <code>Before its numbers were threatened by habitat destruction, storms, and hunting, some of those groups could number 500 or more members. Sunrise, sunset. Rodrigues fruit bats are most active at dawn, at dusk, and at night.</code> |
1052
+ | <code>why does your heart rate increase during exercise bbc bitesize?</code> | <code>During exercise there is an increase in physical activity and muscle cells respire more than they do when the body is at rest. The heart rate increases during exercise. The rate and depth of breathing increases - this makes sure that more oxygen is absorbed into the blood, and more carbon dioxide is removed from it.</code> |
1053
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
1054
+ ```json
1055
+ {
1056
+ "loss": "CachedMultipleNegativesRankingLoss",
1057
+ "matryoshka_dims": [
1058
+ 768,
1059
+ 512,
1060
+ 256,
1061
+ 128,
1062
+ 64,
1063
+ 32
1064
+ ],
1065
+ "matryoshka_weights": [
1066
+ 1,
1067
+ 1,
1068
+ 1,
1069
+ 1,
1070
+ 1,
1071
+ 1
1072
+ ],
1073
+ "n_dims_per_step": -1
1074
+ }
1075
+ ```
1076
+
1077
+ ### Training Hyperparameters
1078
+ #### Non-Default Hyperparameters
1079
+
1080
+ - `eval_strategy`: steps
1081
+ - `per_device_train_batch_size`: 2048
1082
+ - `per_device_eval_batch_size`: 2048
1083
+ - `learning_rate`: 8e-05
1084
+ - `num_train_epochs`: 1
1085
+ - `warmup_ratio`: 0.1
1086
+ - `bf16`: True
1087
+ - `batch_sampler`: no_duplicates
1088
+
1089
+ #### All Hyperparameters
1090
+ <details><summary>Click to expand</summary>
1091
+
1092
+ - `overwrite_output_dir`: False
1093
+ - `do_predict`: False
1094
+ - `eval_strategy`: steps
1095
+ - `prediction_loss_only`: True
1096
+ - `per_device_train_batch_size`: 2048
1097
+ - `per_device_eval_batch_size`: 2048
1098
+ - `per_gpu_train_batch_size`: None
1099
+ - `per_gpu_eval_batch_size`: None
1100
+ - `gradient_accumulation_steps`: 1
1101
+ - `eval_accumulation_steps`: None
1102
+ - `torch_empty_cache_steps`: None
1103
+ - `learning_rate`: 8e-05
1104
+ - `weight_decay`: 0.0
1105
+ - `adam_beta1`: 0.9
1106
+ - `adam_beta2`: 0.999
1107
+ - `adam_epsilon`: 1e-08
1108
+ - `max_grad_norm`: 1.0
1109
+ - `num_train_epochs`: 1
1110
+ - `max_steps`: -1
1111
+ - `lr_scheduler_type`: linear
1112
+ - `lr_scheduler_kwargs`: {}
1113
+ - `warmup_ratio`: 0.1
1114
+ - `warmup_steps`: 0
1115
+ - `log_level`: passive
1116
+ - `log_level_replica`: warning
1117
+ - `log_on_each_node`: True
1118
+ - `logging_nan_inf_filter`: True
1119
+ - `save_safetensors`: True
1120
+ - `save_on_each_node`: False
1121
+ - `save_only_model`: False
1122
+ - `restore_callback_states_from_checkpoint`: False
1123
+ - `no_cuda`: False
1124
+ - `use_cpu`: False
1125
+ - `use_mps_device`: False
1126
+ - `seed`: 42
1127
+ - `data_seed`: None
1128
+ - `jit_mode_eval`: False
1129
+ - `use_ipex`: False
1130
+ - `bf16`: True
1131
+ - `fp16`: False
1132
+ - `fp16_opt_level`: O1
1133
+ - `half_precision_backend`: auto
1134
+ - `bf16_full_eval`: False
1135
+ - `fp16_full_eval`: False
1136
+ - `tf32`: None
1137
+ - `local_rank`: 0
1138
+ - `ddp_backend`: None
1139
+ - `tpu_num_cores`: None
1140
+ - `tpu_metrics_debug`: False
1141
+ - `debug`: []
1142
+ - `dataloader_drop_last`: False
1143
+ - `dataloader_num_workers`: 0
1144
+ - `dataloader_prefetch_factor`: None
1145
+ - `past_index`: -1
1146
+ - `disable_tqdm`: False
1147
+ - `remove_unused_columns`: True
1148
+ - `label_names`: None
1149
+ - `load_best_model_at_end`: False
1150
+ - `ignore_data_skip`: False
1151
+ - `fsdp`: []
1152
+ - `fsdp_min_num_params`: 0
1153
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
1154
+ - `fsdp_transformer_layer_cls_to_wrap`: None
1155
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
1156
+ - `deepspeed`: None
1157
+ - `label_smoothing_factor`: 0.0
1158
+ - `optim`: adamw_torch
1159
+ - `optim_args`: None
1160
+ - `adafactor`: False
1161
+ - `group_by_length`: False
1162
+ - `length_column_name`: length
1163
+ - `ddp_find_unused_parameters`: None
1164
+ - `ddp_bucket_cap_mb`: None
1165
+ - `ddp_broadcast_buffers`: False
1166
+ - `dataloader_pin_memory`: True
1167
+ - `dataloader_persistent_workers`: False
1168
+ - `skip_memory_metrics`: True
1169
+ - `use_legacy_prediction_loop`: False
1170
+ - `push_to_hub`: False
1171
+ - `resume_from_checkpoint`: None
1172
+ - `hub_model_id`: None
1173
+ - `hub_strategy`: every_save
1174
+ - `hub_private_repo`: False
1175
+ - `hub_always_push`: False
1176
+ - `gradient_checkpointing`: False
1177
+ - `gradient_checkpointing_kwargs`: None
1178
+ - `include_inputs_for_metrics`: False
1179
+ - `include_for_metrics`: []
1180
+ - `eval_do_concat_batches`: True
1181
+ - `fp16_backend`: auto
1182
+ - `push_to_hub_model_id`: None
1183
+ - `push_to_hub_organization`: None
1184
+ - `mp_parameters`:
1185
+ - `auto_find_batch_size`: False
1186
+ - `full_determinism`: False
1187
+ - `torchdynamo`: None
1188
+ - `ray_scope`: last
1189
+ - `ddp_timeout`: 1800
1190
+ - `torch_compile`: False
1191
+ - `torch_compile_backend`: None
1192
+ - `torch_compile_mode`: None
1193
+ - `dispatch_batches`: None
1194
+ - `split_batches`: None
1195
+ - `include_tokens_per_second`: False
1196
+ - `include_num_input_tokens_seen`: False
1197
+ - `neftune_noise_alpha`: None
1198
+ - `optim_target_modules`: None
1199
+ - `batch_eval_metrics`: False
1200
+ - `eval_on_start`: False
1201
+ - `use_liger_kernel`: False
1202
+ - `eval_use_gather_object`: False
1203
+ - `average_tokens_across_devices`: False
1204
+ - `prompts`: None
1205
+ - `batch_sampler`: no_duplicates
1206
+ - `multi_dataset_batch_sampler`: proportional
1207
+
1208
+ </details>
1209
+
1210
+ ### Training Logs
1211
+ | Epoch | Step | Training Loss | Validation Loss | NanoClimateFEVER_cosine_ndcg@10 | NanoDBPedia_cosine_ndcg@10 | NanoFEVER_cosine_ndcg@10 | NanoFiQA2018_cosine_ndcg@10 | NanoHotpotQA_cosine_ndcg@10 | NanoMSMARCO_cosine_ndcg@10 | NanoNFCorpus_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoQuoraRetrieval_cosine_ndcg@10 | NanoSCIDOCS_cosine_ndcg@10 | NanoArguAna_cosine_ndcg@10 | NanoSciFact_cosine_ndcg@10 | NanoTouche2020_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
1212
+ |:------:|:----:|:-------------:|:---------------:|:-------------------------------:|:--------------------------:|:------------------------:|:---------------------------:|:---------------------------:|:--------------------------:|:---------------------------:|:---------------------:|:---------------------------------:|:--------------------------:|:--------------------------:|:--------------------------:|:-----------------------------:|:----------------------------:|
1213
+ | 0 | 0 | - | - | 0.0419 | 0.1123 | 0.0389 | 0.0309 | 0.0746 | 0.1310 | 0.0311 | 0.0397 | 0.6607 | 0.0638 | 0.2616 | 0.1097 | 0.1098 | 0.1312 |
1214
+ | 0.0007 | 1 | 41.9671 | - | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
1215
+ | 0.0682 | 100 | 12.4237 | 1.0176 | 0.3022 | 0.4597 | 0.7934 | 0.4621 | 0.5280 | 0.4849 | 0.2517 | 0.5561 | 0.8988 | 0.3144 | 0.5708 | 0.5755 | 0.4514 | 0.5115 |
1216
+ | 0.1363 | 200 | 3.0536 | 0.6917 | 0.2883 | 0.4588 | 0.7773 | 0.4272 | 0.5264 | 0.5494 | 0.2538 | 0.5837 | 0.9303 | 0.2945 | 0.5493 | 0.5795 | 0.4547 | 0.5133 |
1217
+ | 0.2045 | 300 | 2.2724 | 0.5954 | 0.2944 | 0.4606 | 0.7825 | 0.4522 | 0.5247 | 0.5069 | 0.2554 | 0.5636 | 0.9177 | 0.2861 | 0.5560 | 0.5562 | 0.4667 | 0.5095 |
1218
+ | 0.2727 | 400 | 1.933 | 0.5171 | 0.3027 | 0.4841 | 0.7050 | 0.4406 | 0.4877 | 0.5406 | 0.2768 | 0.6014 | 0.9463 | 0.2989 | 0.5725 | 0.6151 | 0.4680 | 0.5184 |
1219
+ | 0.3408 | 500 | 1.7806 | 0.4745 | 0.3034 | 0.4857 | 0.7537 | 0.4435 | 0.5661 | 0.5529 | 0.2733 | 0.5878 | 0.9470 | 0.3016 | 0.5377 | 0.6073 | 0.4682 | 0.5252 |
1220
+ | 0.4090 | 600 | 1.6253 | 0.4392 | 0.3018 | 0.4790 | 0.7502 | 0.4617 | 0.5478 | 0.5411 | 0.2812 | 0.6220 | 0.9443 | 0.2916 | 0.5210 | 0.5900 | 0.4644 | 0.5228 |
1221
+ | 0.4772 | 700 | 1.5136 | 0.4312 | 0.3175 | 0.4846 | 0.7481 | 0.4168 | 0.5761 | 0.5222 | 0.2825 | 0.6142 | 0.9415 | 0.2888 | 0.5373 | 0.5754 | 0.4675 | 0.5210 |
1222
+ | 0.5453 | 800 | 1.4454 | 0.4022 | 0.3017 | 0.4756 | 0.7307 | 0.4494 | 0.5484 | 0.5184 | 0.2821 | 0.6182 | 0.9440 | 0.2834 | 0.5191 | 0.6071 | 0.4694 | 0.5191 |
1223
+ | 0.6135 | 900 | 1.3711 | 0.3886 | 0.2945 | 0.4602 | 0.7463 | 0.4529 | 0.5433 | 0.5457 | 0.2730 | 0.5972 | 0.9449 | 0.2776 | 0.5183 | 0.6018 | 0.4716 | 0.5175 |
1224
+ | 0.6817 | 1000 | 1.3295 | 0.3688 | 0.2811 | 0.4720 | 0.7275 | 0.4342 | 0.5581 | 0.5418 | 0.2809 | 0.6087 | 0.9421 | 0.2823 | 0.5138 | 0.5729 | 0.4662 | 0.5140 |
1225
+ | 0.7498 | 1100 | 1.267 | 0.3637 | 0.2815 | 0.4666 | 0.7168 | 0.4346 | 0.5348 | 0.5317 | 0.2789 | 0.6056 | 0.9450 | 0.2775 | 0.5117 | 0.6116 | 0.4583 | 0.5119 |
1226
+ | 0.8180 | 1200 | 1.2542 | 0.3514 | 0.2882 | 0.4659 | 0.7275 | 0.4308 | 0.5585 | 0.5373 | 0.2788 | 0.5950 | 0.9433 | 0.2767 | 0.5241 | 0.6141 | 0.4655 | 0.5158 |
1227
+ | 0.8862 | 1300 | 1.2146 | 0.3427 | 0.2932 | 0.4638 | 0.7118 | 0.4453 | 0.5636 | 0.5363 | 0.2788 | 0.6098 | 0.9481 | 0.2825 | 0.5160 | 0.6238 | 0.4619 | 0.5181 |
1228
+ | 0.9543 | 1400 | 1.1892 | 0.3378 | 0.2809 | 0.4610 | 0.7319 | 0.4353 | 0.5397 | 0.5295 | 0.2828 | 0.6029 | 0.9474 | 0.2931 | 0.5078 | 0.6182 | 0.4602 | 0.5147 |
1229
+ | 1.0 | 1467 | - | - | 0.2832 | 0.4606 | 0.7422 | 0.4396 | 0.5464 | 0.5254 | 0.2784 | 0.6103 | 0.9468 | 0.2918 | 0.5108 | 0.6182 | 0.4589 | 0.5163 |
1230
+
1231
+
1232
+ ### Environmental Impact
1233
+ Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
1234
+ - **Energy Consumed**: 2.318 kWh
1235
+ - **Carbon Emitted**: 0.901 kg of CO2
1236
+ - **Hours Used**: 5.999
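As a quick sanity check, the figures above imply a grid carbon intensity of roughly 0.39 kg of CO2 per kWh:

```python
energy_kwh = 2.318  # Energy Consumed, from above
co2_kg = 0.901      # Carbon Emitted, from above

intensity = co2_kg / energy_kwh  # implied kg CO2 per kWh
print(round(intensity, 3))  # 0.389
```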
1237
+
1238
+ ### Training Hardware
1239
+ - **On Cloud**: No
1240
+ - **GPU Model**: 1 x NVIDIA GeForce RTX 3090
1241
+ - **CPU Model**: 13th Gen Intel(R) Core(TM) i7-13700K
1242
+ - **RAM Size**: 31.78 GB
1243
+
1244
+ ### Framework Versions
1245
+ - Python: 3.11.6
1246
+ - Sentence Transformers: 3.4.0.dev0
1247
+ - Transformers: 4.46.2
1248
+ - PyTorch: 2.5.0+cu121
1249
+ - Accelerate: 1.1.1
1250
+ - Datasets: 2.20.0
1251
+ - Tokenizers: 0.20.3
1252
+
1253
+ ## Citation
1254
+
1255
+ ### BibTeX
1256
+
1257
+ #### Sentence Transformers
1258
+ ```bibtex
1259
+ @inproceedings{reimers-2019-sentence-bert,
1260
+ title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
1261
+ author = "Reimers, Nils and Gurevych, Iryna",
1262
+ booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
1263
+ month = "11",
1264
+ year = "2019",
1265
+ publisher = "Association for Computational Linguistics",
1266
+ url = "https://arxiv.org/abs/1908.10084",
1267
+ }
1268
+ ```
1269
+
1270
+ #### MatryoshkaLoss
1271
+ ```bibtex
1272
+ @misc{kusupati2024matryoshka,
1273
+ title={Matryoshka Representation Learning},
1274
+ author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
1275
+ year={2024},
1276
+ eprint={2205.13147},
1277
+ archivePrefix={arXiv},
1278
+ primaryClass={cs.LG}
1279
+ }
1280
+ ```
1281
+
1282
+ #### CachedMultipleNegativesRankingLoss
1283
+ ```bibtex
1284
+ @misc{gao2021scaling,
1285
+ title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
1286
+ author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
1287
+ year={2021},
1288
+ eprint={2101.06983},
1289
+ archivePrefix={arXiv},
1290
+ primaryClass={cs.LG}
1291
+ }
1292
+ ```
1293
+
1294
+ <!--
1295
+ ## Glossary
1296
+
1297
+ *Clearly define terms in order to be accessible across audiences.*
1298
+ -->
1299
+
1300
+ <!--
1301
+ ## Model Card Authors
1302
+
1303
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
1304
+ -->
1305
+
1306
+ <!--
1307
+ ## Model Card Contact
1308
+
1309
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
1310
+ -->
config.json ADDED
@@ -0,0 +1,24 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "microsoft/mpnet-base",
3
+ "architectures": [
4
+ "MPNetModel"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "bos_token_id": 0,
8
+ "eos_token_id": 2,
9
+ "hidden_act": "gelu",
10
+ "hidden_dropout_prob": 0.1,
11
+ "hidden_size": 768,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 3072,
14
+ "layer_norm_eps": 1e-05,
15
+ "max_position_embeddings": 514,
16
+ "model_type": "mpnet",
17
+ "num_attention_heads": 12,
18
+ "num_hidden_layers": 12,
19
+ "pad_token_id": 1,
20
+ "relative_attention_num_buckets": 32,
21
+ "torch_dtype": "float32",
22
+ "transformers_version": "4.46.2",
23
+ "vocab_size": 30527
24
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "__version__": {
3
+ "sentence_transformers": "3.4.0.dev0",
4
+ "transformers": "4.46.2",
5
+ "pytorch": "2.5.0+cu121"
6
+ },
7
+ "prompts": {},
8
+ "default_prompt_name": null,
9
+ "similarity_fn_name": "cosine"
10
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:fa1f8a3b9d1bf3a1df6a622b2af83742caf5c2f745d1e47743513349c18901b5
3
+ size 437967672
modules.json ADDED
@@ -0,0 +1,14 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ }
14
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
 
 
 
 
 
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,51 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "bos_token": {
3
+ "content": "<s>",
4
+ "lstrip": false,
5
+ "normalized": false,
6
+ "rstrip": false,
7
+ "single_word": false
8
+ },
9
+ "cls_token": {
10
+ "content": "<s>",
11
+ "lstrip": false,
12
+ "normalized": true,
13
+ "rstrip": false,
14
+ "single_word": false
15
+ },
16
+ "eos_token": {
17
+ "content": "</s>",
18
+ "lstrip": false,
19
+ "normalized": false,
20
+ "rstrip": false,
21
+ "single_word": false
22
+ },
23
+ "mask_token": {
24
+ "content": "<mask>",
25
+ "lstrip": true,
26
+ "normalized": false,
27
+ "rstrip": false,
28
+ "single_word": false
29
+ },
30
+ "pad_token": {
31
+ "content": "<pad>",
32
+ "lstrip": false,
33
+ "normalized": false,
34
+ "rstrip": false,
35
+ "single_word": false
36
+ },
37
+ "sep_token": {
38
+ "content": "</s>",
39
+ "lstrip": false,
40
+ "normalized": true,
41
+ "rstrip": false,
42
+ "single_word": false
43
+ },
44
+ "unk_token": {
45
+ "content": "[UNK]",
46
+ "lstrip": false,
47
+ "normalized": false,
48
+ "rstrip": false,
49
+ "single_word": false
50
+ }
51
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,65 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "added_tokens_decoder": {
3
+ "0": {
4
+ "content": "<s>",
5
+ "lstrip": false,
6
+ "normalized": false,
7
+ "rstrip": false,
8
+ "single_word": false,
9
+ "special": true
10
+ },
11
+ "1": {
12
+ "content": "<pad>",
13
+ "lstrip": false,
14
+ "normalized": false,
15
+ "rstrip": false,
16
+ "single_word": false,
17
+ "special": true
18
+ },
19
+ "2": {
20
+ "content": "</s>",
21
+ "lstrip": false,
22
+ "normalized": false,
23
+ "rstrip": false,
24
+ "single_word": false,
25
+ "special": true
26
+ },
27
+ "3": {
28
+ "content": "<unk>",
29
+ "lstrip": false,
30
+ "normalized": true,
31
+ "rstrip": false,
32
+ "single_word": false,
33
+ "special": true
34
+ },
35
+ "104": {
36
+ "content": "[UNK]",
37
+ "lstrip": false,
38
+ "normalized": false,
39
+ "rstrip": false,
40
+ "single_word": false,
41
+ "special": true
42
+ },
43
+ "30526": {
44
+ "content": "<mask>",
45
+ "lstrip": true,
46
+ "normalized": false,
47
+ "rstrip": false,
48
+ "single_word": false,
49
+ "special": true
50
+ }
51
+ },
52
+ "bos_token": "<s>",
53
+ "clean_up_tokenization_spaces": false,
54
+ "cls_token": "<s>",
55
+ "do_lower_case": true,
56
+ "eos_token": "</s>",
57
+ "mask_token": "<mask>",
58
+ "model_max_length": 512,
59
+ "pad_token": "<pad>",
60
+ "sep_token": "</s>",
61
+ "strip_accents": null,
62
+ "tokenize_chinese_chars": true,
63
+ "tokenizer_class": "MPNetTokenizer",
64
+ "unk_token": "[UNK]"
65
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff