Muennighoff eduagarcia committed on
Commit
7aae94f
1 Parent(s): 911be8d

Refactor code: Pull leaderboard and model configurations out of app.py (#106)


- Code refactor: moved leaderboard configs from app.py to .yaml files (21c6649839c15ca043a2cb24373c2f0b6eebc220)
- minor fixes (fd4838e20b8259f72b5979c5516824e36fd051a6)
- Caches model cards and dim_seq_size when first initializing the leaderboard (bbfe97ce69a629d673c2cc254c7d1c4e4caae5d1)
- Fix a bug that made clickable model names fail to render in some boards (349b10b04819aa2b28caa334e65c4b6f954d76db)
- Fix column order on refresh (a20529c61b4aede14f52bec0f77666bbd89e593f)
- fix missing German clustering (9066f738f02c40a149891f929cb99446c3f9c504)
- Caches model metadata cards to a temporary file to speed up initialization (6f8ad2faabf5533e5c8a879d0325632c0589f5d5)
- Clean up invalid tasks and columns when loading the leaderboard and using the refresh button (879c7e7b01f9e14e39b654ef7f7a532d3327a071)


Co-authored-by: Eduardo Garcia <[email protected]>

Files changed (5)
  1. .gitignore +2 -1
  2. app.py +0 -0
  3. config.yaml +364 -0
  4. envs.py +48 -0
  5. model_meta.yaml +1160 -0
.gitignore CHANGED
@@ -1 +1,2 @@
- *.pyc
+ *.pyc
+ model_infos.json
app.py CHANGED
The diff for this file is too large to render. See raw diff
 
config.yaml ADDED
@@ -0,0 +1,364 @@
1
+ config:
2
+ REPO_ID: "mteb/leaderboard"
3
+ RESULTS_REPO: mteb/results
4
+ LEADERBOARD_NAME: "MTEB Leaderboard"
5
+ tasks:
6
+ BitextMining:
7
+ icon: "🎌"
8
+ metric: f1
9
+ metric_description: "[F1](https://huggingface.co/spaces/evaluate-metric/f1)"
10
+ Classification:
11
+ icon: "❤️"
12
+ metric: accuracy
13
+ metric_description: "[Accuracy](https://huggingface.co/spaces/evaluate-metric/accuracy)"
14
+ Clustering:
15
+ icon: "✨"
16
+ metric: v_measure
17
+ metric_description: "Validity Measure (v_measure)"
18
+ PairClassification:
19
+ icon: "🎭"
20
+ metric: cos_sim_ap
21
+ metric_description: "Average Precision based on Cosine Similarities (cos_sim_ap)"
22
+ Reranking:
23
+ icon: "🥈"
24
+ metric: map
25
+ metric_description: "Mean Average Precision (MAP)"
26
+ Retrieval:
27
+ icon: "🔎"
28
+ metric: ndcg_at_10
29
+ metric_description: "Normalized Discounted Cumulative Gain @ k (ndcg_at_10)"
30
+ STS:
31
+ icon: "🤖"
32
+ metric: cos_sim_spearman
33
+ metric_description: "Spearman correlation based on cosine similarity"
34
+ Summarization:
35
+ icon: "📜"
36
+ metric: cos_sim_spearman
37
+ metric_description: "Spearman correlation based on cosine similarity"
38
+ boards:
39
+ en:
40
+ title: English
41
+ language_long: "English"
42
+ has_overall: true
43
+ acronym: null
44
+ icon: null
45
+ special_icons: null
46
+ credits: null
47
+ tasks:
48
+ Classification:
49
+ - AmazonCounterfactualClassification (en)
50
+ - AmazonPolarityClassification
51
+ - AmazonReviewsClassification (en)
52
+ - Banking77Classification
53
+ - EmotionClassification
54
+ - ImdbClassification
55
+ - MassiveIntentClassification (en)
56
+ - MassiveScenarioClassification (en)
57
+ - MTOPDomainClassification (en)
58
+ - MTOPIntentClassification (en)
59
+ - ToxicConversationsClassification
60
+ - TweetSentimentExtractionClassification
61
+ Clustering:
62
+ - ArxivClusteringP2P
63
+ - ArxivClusteringS2S
64
+ - BiorxivClusteringP2P
65
+ - BiorxivClusteringS2S
66
+ - MedrxivClusteringP2P
67
+ - MedrxivClusteringS2S
68
+ - RedditClustering
69
+ - RedditClusteringP2P
70
+ - StackExchangeClustering
71
+ - StackExchangeClusteringP2P
72
+ - TwentyNewsgroupsClustering
73
+ PairClassification:
74
+ - SprintDuplicateQuestions
75
+ - TwitterSemEval2015
76
+ - TwitterURLCorpus
77
+ Reranking:
78
+ - AskUbuntuDupQuestions
79
+ - MindSmallReranking
80
+ - SciDocsRR
81
+ - StackOverflowDupQuestions
82
+ Retrieval:
83
+ - ArguAna
84
+ - ClimateFEVER
85
+ - CQADupstackRetrieval
86
+ - DBPedia
87
+ - FEVER
88
+ - FiQA2018
89
+ - HotpotQA
90
+ - MSMARCO
91
+ - NFCorpus
92
+ - NQ
93
+ - QuoraRetrieval
94
+ - SCIDOCS
95
+ - SciFact
96
+ - Touche2020
97
+ - TRECCOVID
98
+ STS:
99
+ - BIOSSES
100
+ - SICK-R
101
+ - STS12
102
+ - STS13
103
+ - STS14
104
+ - STS15
105
+ - STS16
106
+ - STS17 (en-en)
107
+ - STS22 (en)
108
+ - STSBenchmark
109
+ Summarization:
110
+ - SummEval
111
+ en-x:
112
+ title: "English-X"
113
+ language_long: "117 (Pairs of: English & other language)"
114
+ has_overall: false
115
+ acronym: null
116
+ icon: null
117
+ special_icons: null
118
+ credits: null
119
+ tasks:
120
+ BitextMining: ['BUCC (de-en)', 'BUCC (fr-en)', 'BUCC (ru-en)', 'BUCC (zh-en)', 'Tatoeba (afr-eng)', 'Tatoeba (amh-eng)', 'Tatoeba (ang-eng)', 'Tatoeba (ara-eng)', 'Tatoeba (arq-eng)', 'Tatoeba (arz-eng)', 'Tatoeba (ast-eng)', 'Tatoeba (awa-eng)', 'Tatoeba (aze-eng)', 'Tatoeba (bel-eng)', 'Tatoeba (ben-eng)', 'Tatoeba (ber-eng)', 'Tatoeba (bos-eng)', 'Tatoeba (bre-eng)', 'Tatoeba (bul-eng)', 'Tatoeba (cat-eng)', 'Tatoeba (cbk-eng)', 'Tatoeba (ceb-eng)', 'Tatoeba (ces-eng)', 'Tatoeba (cha-eng)', 'Tatoeba (cmn-eng)', 'Tatoeba (cor-eng)', 'Tatoeba (csb-eng)', 'Tatoeba (cym-eng)', 'Tatoeba (dan-eng)', 'Tatoeba (deu-eng)', 'Tatoeba (dsb-eng)', 'Tatoeba (dtp-eng)', 'Tatoeba (ell-eng)', 'Tatoeba (epo-eng)', 'Tatoeba (est-eng)', 'Tatoeba (eus-eng)', 'Tatoeba (fao-eng)', 'Tatoeba (fin-eng)', 'Tatoeba (fra-eng)', 'Tatoeba (fry-eng)', 'Tatoeba (gla-eng)', 'Tatoeba (gle-eng)', 'Tatoeba (glg-eng)', 'Tatoeba (gsw-eng)', 'Tatoeba (heb-eng)', 'Tatoeba (hin-eng)', 'Tatoeba (hrv-eng)', 'Tatoeba (hsb-eng)', 'Tatoeba (hun-eng)', 'Tatoeba (hye-eng)', 'Tatoeba (ido-eng)', 'Tatoeba (ile-eng)', 'Tatoeba (ina-eng)', 'Tatoeba (ind-eng)', 'Tatoeba (isl-eng)', 'Tatoeba (ita-eng)', 'Tatoeba (jav-eng)', 'Tatoeba (jpn-eng)', 'Tatoeba (kab-eng)', 'Tatoeba (kat-eng)', 'Tatoeba (kaz-eng)', 'Tatoeba (khm-eng)', 'Tatoeba (kor-eng)', 'Tatoeba (kur-eng)', 'Tatoeba (kzj-eng)', 'Tatoeba (lat-eng)', 'Tatoeba (lfn-eng)', 'Tatoeba (lit-eng)', 'Tatoeba (lvs-eng)', 'Tatoeba (mal-eng)', 'Tatoeba (mar-eng)', 'Tatoeba (max-eng)', 'Tatoeba (mhr-eng)', 'Tatoeba (mkd-eng)', 'Tatoeba (mon-eng)', 'Tatoeba (nds-eng)', 'Tatoeba (nld-eng)', 'Tatoeba (nno-eng)', 'Tatoeba (nob-eng)', 'Tatoeba (nov-eng)', 'Tatoeba (oci-eng)', 'Tatoeba (orv-eng)', 'Tatoeba (pam-eng)', 'Tatoeba (pes-eng)', 'Tatoeba (pms-eng)', 'Tatoeba (pol-eng)', 'Tatoeba (por-eng)', 'Tatoeba (ron-eng)', 'Tatoeba (rus-eng)', 'Tatoeba (slk-eng)', 'Tatoeba (slv-eng)', 'Tatoeba (spa-eng)', 'Tatoeba (sqi-eng)', 'Tatoeba (srp-eng)', 'Tatoeba (swe-eng)', 'Tatoeba (swg-eng)', 'Tatoeba (swh-eng)', 'Tatoeba (tam-eng)', 'Tatoeba (tat-eng)', 'Tatoeba (tel-eng)', 'Tatoeba (tgl-eng)', 'Tatoeba (tha-eng)', 'Tatoeba (tuk-eng)', 'Tatoeba (tur-eng)', 'Tatoeba (tzl-eng)', 'Tatoeba (uig-eng)', 'Tatoeba (ukr-eng)', 'Tatoeba (urd-eng)', 'Tatoeba (uzb-eng)', 'Tatoeba (vie-eng)', 'Tatoeba (war-eng)', 'Tatoeba (wuu-eng)', 'Tatoeba (xho-eng)', 'Tatoeba (yid-eng)', 'Tatoeba (yue-eng)', 'Tatoeba (zsm-eng)']
121
+ zh:
122
+ title: Chinese
123
+ language_long: Chinese
124
+ has_overall: true
125
+ acronym: C-MTEB
126
+ icon: "🇨🇳"
127
+ special_icons:
128
+ Classification: "🧡"
129
+ credits: "[FlagEmbedding](https://github.com/FlagOpen/FlagEmbedding)"
130
+ tasks:
131
+ Classification:
132
+ - AmazonReviewsClassification (zh)
133
+ - IFlyTek
134
+ - JDReview
135
+ - MassiveIntentClassification (zh-CN)
136
+ - MassiveScenarioClassification (zh-CN)
137
+ - MultilingualSentiment
138
+ - OnlineShopping
139
+ - TNews
140
+ - Waimai
141
+ Clustering:
142
+ - CLSClusteringP2P
143
+ - CLSClusteringS2S
144
+ - ThuNewsClusteringP2P
145
+ - ThuNewsClusteringS2S
146
+ PairClassification:
147
+ - Cmnli
148
+ - Ocnli
149
+ Reranking:
150
+ - CMedQAv1
151
+ - CMedQAv2
152
+ - MMarcoReranking
153
+ - T2Reranking
154
+ Retrieval:
155
+ - CmedqaRetrieval
156
+ - CovidRetrieval
157
+ - DuRetrieval
158
+ - EcomRetrieval
159
+ - MedicalRetrieval
160
+ - MMarcoRetrieval
161
+ - T2Retrieval
162
+ - VideoRetrieval
163
+ STS:
164
+ - AFQMC
165
+ - ATEC
166
+ - BQ
167
+ - LCQMC
168
+ - PAWSX
169
+ - QBQTC
170
+ - STS22 (zh)
171
+ - STSB
172
+ da:
173
+ title: Danish
174
+ language_long: Danish
175
+ has_overall: false
176
+ acronym: null
177
+ icon: "🇩🇰"
178
+ special_icons:
179
+ Classification: "🤍"
180
+ credits: "[Kenneth Enevoldsen](https://github.com/KennethEnevoldsen), [scandinavian-embedding-benchmark](https://kennethenevoldsen.github.io/scandinavian-embedding-benchmark/)"
181
+ tasks:
182
+ BitextMining:
183
+ - BornholmBitextMining
184
+ Classification:
185
+ - AngryTweetsClassification
186
+ - DanishPoliticalCommentsClassification
187
+ - DKHateClassification
188
+ - LccSentimentClassification
189
+ - MassiveIntentClassification (da)
190
+ - MassiveScenarioClassification (da)
191
+ - NordicLangClassification
192
+ - ScalaDaClassification
193
+ fr:
194
+ title: French
195
+ language_long: "French"
196
+ has_overall: true
197
+ acronym: "F-MTEB"
198
+ icon: "🇫🇷"
199
+ special_icons:
200
+ Classification: "💙"
201
+ credits: "[Lyon-NLP](https://github.com/Lyon-NLP): [Gabriel Sequeira](https://github.com/GabrielSequeira), [Imene Kerboua](https://github.com/imenelydiaker), [Wissam Siblini](https://github.com/wissam-sib), [Mathieu Ciancone](https://github.com/MathieuCiancone), [Marion Schaeffer](https://github.com/schmarion)"
202
+ tasks:
203
+ Classification:
204
+ - AmazonReviewsClassification (fr)
205
+ - MasakhaNEWSClassification (fra)
206
+ - MassiveIntentClassification (fr)
207
+ - MassiveScenarioClassification (fr)
208
+ - MTOPDomainClassification (fr)
209
+ - MTOPIntentClassification (fr)
210
+ Clustering:
211
+ - AlloProfClusteringP2P
212
+ - AlloProfClusteringS2S
213
+ - HALClusteringS2S
214
+ - MLSUMClusteringP2P
215
+ - MLSUMClusteringS2S
216
+ - MasakhaNEWSClusteringP2P (fra)
217
+ - MasakhaNEWSClusteringS2S (fra)
218
+ PairClassification:
219
+ - OpusparcusPC (fr)
220
+ - PawsX (fr)
221
+ Reranking:
222
+ - AlloprofReranking
223
+ - SyntecReranking
224
+ Retrieval:
225
+ - AlloprofRetrieval
226
+ - BSARDRetrieval
227
+ - MintakaRetrieval (fr)
228
+ - SyntecRetrieval
229
+ - XPQARetrieval (fr)
230
+ STS:
231
+ - STS22 (fr)
232
+ - STSBenchmarkMultilingualSTS (fr)
233
+ - SICKFr
234
+ Summarization:
235
+ - SummEvalFr
236
+ 'no':
237
+ title: Norwegian
238
+ language_long: "Norwegian Bokmål"
239
+ has_overall: false
240
+ acronym: null
241
+ icon: "🇳🇴"
242
+ special_icons:
243
+ Classification: "💙"
244
+ credits: "[Kenneth Enevoldsen](https://github.com/KennethEnevoldsen), [scandinavian-embedding-benchmark](https://kennethenevoldsen.github.io/scandinavian-embedding-benchmark/)"
245
+ tasks:
246
+ Classification: &id001
247
+ - NoRecClassification
248
+ - NordicLangClassification
249
+ - NorwegianParliament
250
+ - MassiveIntentClassification (nb)
251
+ - MassiveScenarioClassification (nb)
252
+ - ScalaNbClassification
253
+ law:
254
+ title: Law
255
+ language_long: "English, German, Chinese"
256
+ has_overall: false
257
+ acronym: null
258
+ icon: "⚖️"
259
+ special_icons: null
260
+ credits: "[Voyage AI](https://www.voyageai.com/)"
261
+ tasks:
262
+ Retrieval:
263
+ - AILACasedocs
264
+ - AILAStatutes
265
+ - GerDaLIRSmall
266
+ - LeCaRDv2
267
+ - LegalBenchConsumerContractsQA
268
+ - LegalBenchCorporateLobbying
269
+ - LegalQuAD
270
+ - LegalSummarization
271
+ de:
272
+ title: German
273
+ language_long: "German"
274
+ has_overall: false
275
+ acronym: null
276
+ icon: "🇩🇪"
277
+ special_icons: null
278
+ credits: "[Silvan](https://github.com/slvnwhrl)"
279
+ tasks:
280
+ Clustering:
281
+ - BlurbsClusteringP2P
282
+ - BlurbsClusteringS2S
283
+ - TenKGnadClusteringP2P
284
+ - TenKGnadClusteringS2S
285
+ pl:
286
+ title: Polish
287
+ language_long: Polish
288
+ has_overall: true
289
+ acronym: null
290
+ icon: "🇵🇱"
291
+ special_icons:
292
+ Classification: "🤍"
293
+ credits: "[Rafał Poświata](https://github.com/rafalposwiata)"
294
+ tasks:
295
+ Classification:
296
+ - AllegroReviews
297
+ - CBD
298
+ - MassiveIntentClassification (pl)
299
+ - MassiveScenarioClassification (pl)
300
+ - PAC
301
+ - PolEmo2.0-IN
302
+ - PolEmo2.0-OUT
303
+ Clustering:
304
+ - 8TagsClustering
305
+ PairClassification:
306
+ - CDSC-E
307
+ - PPC
308
+ - PSC
309
+ - SICK-E-PL
310
+ Retrieval:
311
+ - ArguAna-PL
312
+ - DBPedia-PL
313
+ - FiQA-PL
314
+ - HotpotQA-PL
315
+ - MSMARCO-PL
316
+ - NFCorpus-PL
317
+ - NQ-PL
318
+ - Quora-PL
319
+ - SCIDOCS-PL
320
+ - SciFact-PL
321
+ - TRECCOVID-PL
322
+ STS:
323
+ - CDSC-R
324
+ - SICK-R-PL
325
+ - STS22 (pl)
326
+ se:
327
+ title: Swedish
328
+ language_long: Swedish
329
+ has_overall: false
330
+ acronym: null
331
+ icon: "🇸🇪"
332
+ special_icons:
333
+ Classification: "💛"
334
+ credits: "[Kenneth Enevoldsen](https://github.com/KennethEnevoldsen), [scandinavian-embedding-benchmark](https://kennethenevoldsen.github.io/scandinavian-embedding-benchmark/)"
335
+ tasks:
336
+ Classification:
337
+ - NoRecClassification
338
+ - NordicLangClassification
339
+ - NorwegianParliament
340
+ - MassiveIntentClassification (nb)
341
+ - MassiveScenarioClassification (nb)
342
+ - ScalaNbClassification
343
+ other-cls:
344
+ title: "Other Languages"
345
+ language_long: "47 (Only languages not included in the other tabs)"
346
+ has_overall: false
347
+ acronym: null
348
+ icon: null
349
+ special_icons:
350
+ Classification: "💜💚💙"
351
+ credits: null
352
+ tasks:
353
+ Classification: ['AmazonCounterfactualClassification (de)', 'AmazonCounterfactualClassification (ja)', 'AmazonReviewsClassification (de)', 'AmazonReviewsClassification (es)', 'AmazonReviewsClassification (fr)', 'AmazonReviewsClassification (ja)', 'AmazonReviewsClassification (zh)', 'MTOPDomainClassification (de)', 'MTOPDomainClassification (es)', 'MTOPDomainClassification (fr)', 'MTOPDomainClassification (hi)', 'MTOPDomainClassification (th)', 'MTOPIntentClassification (de)', 'MTOPIntentClassification (es)', 'MTOPIntentClassification (fr)', 'MTOPIntentClassification (hi)', 'MTOPIntentClassification (th)', 'MassiveIntentClassification (af)', 'MassiveIntentClassification (am)', 'MassiveIntentClassification (ar)', 'MassiveIntentClassification (az)', 'MassiveIntentClassification (bn)', 'MassiveIntentClassification (cy)', 'MassiveIntentClassification (de)', 'MassiveIntentClassification (el)', 'MassiveIntentClassification (es)', 'MassiveIntentClassification (fa)', 'MassiveIntentClassification (fi)', 'MassiveIntentClassification (fr)', 'MassiveIntentClassification (he)', 'MassiveIntentClassification (hi)', 'MassiveIntentClassification (hu)', 'MassiveIntentClassification (hy)', 'MassiveIntentClassification (id)', 'MassiveIntentClassification (is)', 'MassiveIntentClassification (it)', 'MassiveIntentClassification (ja)', 'MassiveIntentClassification (jv)', 'MassiveIntentClassification (ka)', 'MassiveIntentClassification (km)', 'MassiveIntentClassification (kn)', 'MassiveIntentClassification (ko)', 'MassiveIntentClassification (lv)', 'MassiveIntentClassification (ml)', 'MassiveIntentClassification (mn)', 'MassiveIntentClassification (ms)', 'MassiveIntentClassification (my)', 'MassiveIntentClassification (nl)', 'MassiveIntentClassification (pt)', 'MassiveIntentClassification (ro)', 'MassiveIntentClassification (ru)', 'MassiveIntentClassification (sl)', 'MassiveIntentClassification (sq)', 'MassiveIntentClassification (sw)', 'MassiveIntentClassification (ta)', 'MassiveIntentClassification (te)', 'MassiveIntentClassification (th)', 'MassiveIntentClassification (tl)', 'MassiveIntentClassification (tr)', 'MassiveIntentClassification (ur)', 'MassiveIntentClassification (vi)', 'MassiveIntentClassification (zh-TW)', 'MassiveScenarioClassification (af)', 'MassiveScenarioClassification (am)', 'MassiveScenarioClassification (ar)', 'MassiveScenarioClassification (az)', 'MassiveScenarioClassification (bn)', 'MassiveScenarioClassification (cy)', 'MassiveScenarioClassification (de)', 'MassiveScenarioClassification (el)', 'MassiveScenarioClassification (es)', 'MassiveScenarioClassification (fa)', 'MassiveScenarioClassification (fi)', 'MassiveScenarioClassification (fr)', 'MassiveScenarioClassification (he)', 'MassiveScenarioClassification (hi)', 'MassiveScenarioClassification (hu)', 'MassiveScenarioClassification (hy)', 'MassiveScenarioClassification (id)', 'MassiveScenarioClassification (is)', 'MassiveScenarioClassification (it)', 'MassiveScenarioClassification (ja)', 'MassiveScenarioClassification (jv)', 'MassiveScenarioClassification (ka)', 'MassiveScenarioClassification (km)', 'MassiveScenarioClassification (kn)', 'MassiveScenarioClassification (ko)', 'MassiveScenarioClassification (lv)', 'MassiveScenarioClassification (ml)', 'MassiveScenarioClassification (mn)', 'MassiveScenarioClassification (ms)', 'MassiveScenarioClassification (my)', 'MassiveScenarioClassification (nl)', 'MassiveScenarioClassification (pt)', 'MassiveScenarioClassification (ro)', 'MassiveScenarioClassification (ru)', 
'MassiveScenarioClassification (sl)', 'MassiveScenarioClassification (sq)', 'MassiveScenarioClassification (sw)', 'MassiveScenarioClassification (ta)', 'MassiveScenarioClassification (te)', 'MassiveScenarioClassification (th)', 'MassiveScenarioClassification (tl)', 'MassiveScenarioClassification (tr)', 'MassiveScenarioClassification (ur)', 'MassiveScenarioClassification (vi)', 'MassiveScenarioClassification (zh-TW)']
354
+ other-sts:
355
+ title: Other
356
+ language_long: "Arabic, Chinese, Dutch, English, French, German, Italian, Korean, Polish, Russian, Spanish (Only language combos not included in the other tabs)"
357
+ has_overall: false
358
+ acronym: null
359
+ icon: null
360
+ special_icons:
361
+ STS: "👽"
362
+ credits: null
363
+ tasks:
364
+ STS: ["STS17 (ar-ar)", "STS17 (en-ar)", "STS17 (en-de)", "STS17 (en-tr)", "STS17 (es-en)", "STS17 (es-es)", "STS17 (fr-en)", "STS17 (it-en)", "STS17 (ko-ko)", "STS17 (nl-en)", "STS22 (ar)", "STS22 (de)", "STS22 (de-en)", "STS22 (de-fr)", "STS22 (de-pl)", "STS22 (es)", "STS22 (es-en)", "STS22 (es-it)", "STS22 (fr)", "STS22 (fr-pl)", "STS22 (it)", "STS22 (pl)", "STS22 (pl-en)", "STS22 (ru)", "STS22 (tr)", "STS22 (zh-en)", "STSBenchmark"]
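envs.py (below) loads this file with yaml.safe_load into LEADERBOARD_CONFIG. A minimal sketch of traversing the boards section; the iteration itself is illustrative and not code from this commit:

from yaml import safe_load

with open("config.yaml", "r", encoding="utf-8") as f:
    cfg = safe_load(f)

print(cfg["config"]["LEADERBOARD_NAME"])  # "MTEB Leaderboard"
for board_id, board in cfg["boards"].items():
    # Each board maps task types (Classification, Retrieval, ...) to dataset lists.
    n_datasets = sum(len(datasets) for datasets in board["tasks"].values())
    print(f"{board_id}: {board['title']} ({n_datasets} datasets)")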
envs.py ADDED
@@ -0,0 +1,48 @@
+ import os
+ from yaml import safe_load
+
+ from huggingface_hub import HfApi
+
+ LEADERBOARD_CONFIG_PATH = "config.yaml"
+ with open(LEADERBOARD_CONFIG_PATH, 'r', encoding='utf-8') as f:
+     LEADERBOARD_CONFIG = safe_load(f)
+ MODEL_META_PATH = "model_meta.yaml"
+ with open(MODEL_META_PATH, 'r', encoding='utf-8') as f:
+     MODEL_META = safe_load(f)
+
+ # Try first to get the config from the environment variables, then from the config.yaml file
+ def get_config(name, default):
+     res = None
+
+     if name in os.environ:
+         res = os.environ[name]
+     elif 'config' in LEADERBOARD_CONFIG:
+         res = LEADERBOARD_CONFIG['config'].get(name, None)
+
+     if res is None:
+         return default
+     return res
+
+ def str2bool(v):
+     return str(v).lower() in ("yes", "true", "t", "1")
+
+ # Hugging Face token used to authenticate Hub API calls
+ HF_TOKEN = get_config("HF_TOKEN", None)
+
+ LEADERBOARD_NAME = get_config("LEADERBOARD_NAME", "MTEB Leaderboard")
+
+ REPO_ID = get_config("REPO_ID", "mteb/leaderboard")
+ RESULTS_REPO = get_config("RESULTS_REPO", "mteb/results")
+
+ CACHE_PATH = get_config("HF_HOME", ".")
+ os.environ["HF_HOME"] = CACHE_PATH
+
+ # Check whether the Space has write access to persistent storage
+ if not os.access(CACHE_PATH, os.W_OK):
+     print(f"No write access to HF_HOME: {CACHE_PATH}. Resetting to current directory.")
+     CACHE_PATH = "."
+     os.environ["HF_HOME"] = CACHE_PATH
+ else:
+     print("Write access confirmed for HF_HOME")
+
+ API = HfApi(token=HF_TOKEN)
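Because get_config checks os.environ before falling back to config.yaml, any of these settings can be overridden per deployment without editing the repo. An illustrative example, assuming it runs from the Space root so envs.py can find config.yaml and model_meta.yaml; the repo id is hypothetical:

import os

# Must be set before envs is imported, since envs.py reads the values at import time.
os.environ["RESULTS_REPO"] = "my-org/my-results"

import envs

print(envs.RESULTS_REPO)      # -> "my-org/my-results"
print(envs.LEADERBOARD_NAME)  # -> "MTEB Leaderboard" (fallback from config.yaml)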
model_meta.yaml ADDED
@@ -0,0 +1,1160 @@
1
+ model_meta:
2
+ Baichuan-text-embedding:
3
+ link: https://platform.baichuan-ai.com/docs/text-Embedding
4
+ seq_len: 512
5
+ size: null
6
+ dim: 1024
7
+ is_external: true
8
+ is_proprietary: true
9
+ is_sentence_transformers_compatible: false
10
+ Cohere-embed-english-v3.0:
11
+ link: https://huggingface.co/Cohere/Cohere-embed-english-v3.0
12
+ seq_len: 512
13
+ size: null
14
+ dim: 1024
15
+ is_external: true
16
+ is_proprietary: true
17
+ is_sentence_transformers_compatible: false
18
+ Cohere-embed-multilingual-light-v3.0:
19
+ link: https://huggingface.co/Cohere/Cohere-embed-multilingual-light-v3.0
20
+ seq_len: 512
21
+ size: null
22
+ dim: 384
23
+ is_external: true
24
+ is_proprietary: true
25
+ is_sentence_transformers_compatible: false
26
+ Cohere-embed-multilingual-v3.0:
27
+ link: https://huggingface.co/Cohere/Cohere-embed-multilingual-v3.0
28
+ seq_len: 512
29
+ size: null
30
+ dim: 1024
31
+ is_external: true
32
+ is_proprietary: true
33
+ is_sentence_transformers_compatible: false
34
+ DanskBERT:
35
+ link: https://huggingface.co/vesteinn/DanskBERT
36
+ seq_len: 514
37
+ size: 125
38
+ dim: 768
39
+ is_external: true
40
+ is_proprietary: false
41
+ is_sentence_transformers_compatible: true
42
+ LASER2:
43
+ link: https://github.com/facebookresearch/LASER
44
+ seq_len: N/A
45
+ size: 43
46
+ dim: 1024
47
+ is_external: true
48
+ is_proprietary: false
49
+ is_sentence_transformers_compatible: false
50
+ LLM2Vec-Llama-supervised:
51
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Llama-2-7b-chat-hf-mntp-supervised
52
+ seq_len: 4096
53
+ size: 6607
54
+ dim: 4096
55
+ is_external: true
56
+ is_proprietary: false
57
+ is_sentence_transformers_compatible: false
58
+ LLM2Vec-Llama-unsupervised:
59
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Llama-2-7b-chat-hf-mntp
60
+ seq_len: 4096
61
+ size: 6607
62
+ dim: 4096
63
+ is_external: true
64
+ is_proprietary: false
65
+ is_sentence_transformers_compatible: false
66
+ LLM2Vec-Mistral-supervised:
67
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised
68
+ seq_len: 32768
69
+ size: 7111
70
+ dim: 4096
71
+ is_external: true
72
+ is_proprietary: false
73
+ is_sentence_transformers_compatible: false
74
+ LLM2Vec-Mistral-unsupervised:
75
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp
76
+ seq_len: 32768
77
+ size: 7111
78
+ dim: 4096
79
+ is_external: true
80
+ is_proprietary: false
81
+ is_sentence_transformers_compatible: false
82
+ LLM2Vec-Sheared-Llama-supervised:
83
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-supervised
84
+ seq_len: 4096
85
+ size: 1280
86
+ dim: 2048
87
+ is_external: true
88
+ is_proprietary: false
89
+ is_sentence_transformers_compatible: false
90
+ LLM2Vec-Sheared-Llama-unsupervised:
91
+ link: https://huggingface.co/McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp
92
+ seq_len: 4096
93
+ size: 1280
94
+ dim: 2048
95
+ is_external: true
96
+ is_proprietary: false
97
+ is_sentence_transformers_compatible: false
98
+ LaBSE:
99
+ link: https://huggingface.co/sentence-transformers/LaBSE
100
+ seq_len: 512
101
+ size: 471
102
+ dim: 768
103
+ is_external: true
104
+ is_proprietary: false
105
+ is_sentence_transformers_compatible: true
106
+ OpenSearch-text-hybrid:
107
+ link: https://help.aliyun.com/zh/open-search/vector-search-edition/hybrid-retrieval
108
+ seq_len: 512
109
+ size: null
110
+ dim: 1792
111
+ is_external: true
112
+ is_proprietary: true
113
+ is_sentence_transformers_compatible: false
114
+ all-MiniLM-L12-v2:
115
+ link: https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2
116
+ seq_len: 512
117
+ size: 33
118
+ dim: 384
119
+ is_external: true
120
+ is_proprietary: false
121
+ is_sentence_transformers_compatible: true
122
+ all-MiniLM-L6-v2:
123
+ link: https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2
124
+ seq_len: 512
125
+ size: 23
126
+ dim: 384
127
+ is_external: true
128
+ is_proprietary: false
129
+ is_sentence_transformers_compatible: true
130
+ all-mpnet-base-v2:
131
+ link: https://huggingface.co/sentence-transformers/all-mpnet-base-v2
132
+ seq_len: 514
133
+ size: 110
134
+ dim: 768
135
+ is_external: true
136
+ is_proprietary: false
137
+ is_sentence_transformers_compatible: true
138
+ allenai-specter:
139
+ link: https://huggingface.co/sentence-transformers/allenai-specter
140
+ seq_len: 512
141
+ size: 110
142
+ dim: 768
143
+ is_external: true
144
+ is_proprietary: false
145
+ is_sentence_transformers_compatible: true
146
+ bert-base-10lang-cased:
147
+ link: https://huggingface.co/Geotrend/bert-base-10lang-cased
148
+ seq_len: 512
149
+ size: 138
150
+ dim: 768
151
+ is_external: true
152
+ is_proprietary: false
153
+ is_sentence_transformers_compatible: true
154
+ bert-base-15lang-cased:
155
+ link: https://huggingface.co/Geotrend/bert-base-15lang-cased
156
+ seq_len: 512
157
+ size: 138
158
+ dim: 768
159
+ is_external: true
160
+ is_proprietary: false
161
+ is_sentence_transformers_compatible: true
162
+ bert-base-25lang-cased:
163
+ link: https://huggingface.co/Geotrend/bert-base-25lang-cased
164
+ seq_len: 512
165
+ size: 138
166
+ dim: 768
167
+ is_external: true
168
+ is_proprietary: false
169
+ is_sentence_transformers_compatible: true
170
+ bert-base-multilingual-cased:
171
+ link: https://huggingface.co/google-bert/bert-base-multilingual-cased
172
+ seq_len: 512
173
+ size: 179
174
+ dim: 768
175
+ is_external: true
176
+ is_proprietary: false
177
+ is_sentence_transformers_compatible: true
178
+ bert-base-multilingual-uncased:
179
+ link: https://huggingface.co/google-bert/bert-base-multilingual-uncased
180
+ seq_len: 512
181
+ size: 168
182
+ dim: 768
183
+ is_external: true
184
+ is_proprietary: false
185
+ is_sentence_transformers_compatible: true
186
+ bert-base-swedish-cased:
187
+ link: https://huggingface.co/KB/bert-base-swedish-cased
188
+ seq_len: 512
189
+ size: 125
190
+ dim: 768
191
+ is_external: true
192
+ is_proprietary: false
193
+ is_sentence_transformers_compatible: true
194
+ bert-base-uncased:
195
+ link: https://huggingface.co/bert-base-uncased
196
+ seq_len: 512
197
+ size: 110
198
+ dim: 768
199
+ is_external: true
200
+ is_proprietary: false
201
+ is_sentence_transformers_compatible: true
202
+ bge-base-zh-v1.5:
203
+ link: https://huggingface.co/BAAI/bge-base-zh-v1.5
204
+ seq_len: 512
205
+ size: 102
206
+ dim: 768
207
+ is_external: true
208
+ is_proprietary: false
209
+ is_sentence_transformers_compatible: true
210
+ bge-large-en-v1.5:
211
+ link: https://huggingface.co/BAAI/bge-large-en-v1.5
212
+ seq_len: 512
213
+ size: null
214
+ dim: 1024
215
+ is_external: true
216
+ is_proprietary: false
217
+ is_sentence_transformers_compatible: false
218
+ bge-large-zh-noinstruct:
219
+ link: https://huggingface.co/BAAI/bge-large-zh-noinstruct
220
+ seq_len: 512
221
+ size: 326
222
+ dim: 1024
223
+ is_external: true
224
+ is_proprietary: false
225
+ is_sentence_transformers_compatible: true
226
+ bge-large-zh-v1.5:
227
+ link: https://huggingface.co/BAAI/bge-large-zh-v1.5
228
+ seq_len: 512
229
+ size: 326
230
+ dim: 1024
231
+ is_external: true
232
+ is_proprietary: false
233
+ is_sentence_transformers_compatible: true
234
+ bge-small-zh-v1.5:
235
+ link: https://huggingface.co/BAAI/bge-small-zh-v1.5
236
+ seq_len: 512
237
+ size: 24
238
+ dim: 512
239
+ is_external: true
240
+ is_proprietary: false
241
+ is_sentence_transformers_compatible: true
242
+ camembert-base:
243
+ link: https://huggingface.co/almanach/camembert-base
244
+ seq_len: 512
245
+ size: 111
246
+ dim: 512
247
+ is_external: false
248
+ is_proprietary: false
249
+ is_sentence_transformers_compatible: true
250
+ camembert-large:
251
+ link: https://huggingface.co/almanach/camembert-large
252
+ seq_len: 512
253
+ size: 338
254
+ dim: 768
255
+ is_external: false
256
+ is_proprietary: false
257
+ is_sentence_transformers_compatible: true
258
+ contriever-base-msmarco:
259
+ link: https://huggingface.co/nthakur/contriever-base-msmarco
260
+ seq_len: 512
261
+ size: 110
262
+ dim: 768
263
+ is_external: true
264
+ is_proprietary: false
265
+ is_sentence_transformers_compatible: true
266
+ cross-en-de-roberta-sentence-transformer:
267
+ link: https://huggingface.co/T-Systems-onsite/cross-en-de-roberta-sentence-transformer
268
+ seq_len: 514
269
+ size: 278
270
+ dim: 768
271
+ is_external: true
272
+ is_proprietary: false
273
+ is_sentence_transformers_compatible: true
274
+ dfm-encoder-large-v1:
275
+ link: https://huggingface.co/chcaa/dfm-encoder-large-v1
276
+ seq_len: 512
277
+ size: 355
278
+ dim: 1024
279
+ is_external: true
280
+ is_proprietary: false
281
+ is_sentence_transformers_compatible: true
282
+ dfm-sentence-encoder-large-1:
283
+ link: https://huggingface.co/chcaa/dfm-encoder-large-v1
284
+ seq_len: 512
285
+ size: 355
286
+ dim: 1024
287
+ is_external: true
288
+ is_proprietary: false
289
+ is_sentence_transformers_compatible: true
290
+ distilbert-base-25lang-cased:
291
+ link: https://huggingface.co/Geotrend/distilbert-base-25lang-cased
292
+ seq_len: 512
293
+ size: 110
294
+ dim: 768
295
+ is_external: false
296
+ is_proprietary: false
297
+ is_sentence_transformers_compatible: true
298
+ distilbert-base-en-fr-cased:
299
+ link: https://huggingface.co/Geotrend/distilbert-base-en-fr-cased
300
+ seq_len: 512
301
+ size: 110
302
+ dim: 768
303
+ is_external: false
304
+ is_proprietary: false
305
+ is_sentence_transformers_compatible: true
306
+ distilbert-base-en-fr-es-pt-it-cased:
307
+ link: https://huggingface.co/Geotrend/distilbert-base-en-fr-es-pt-it-cased
308
+ seq_len: 512
309
+ size: 110
310
+ dim: 768
311
+ is_external: false
312
+ is_proprietary: false
313
+ is_sentence_transformers_compatible: true
314
+ distilbert-base-fr-cased:
315
+ link: https://huggingface.co/Geotrend/distilbert-base-fr-cased
316
+ seq_len: 512
317
+ size: 110
318
+ dim: 768
319
+ is_external: false
320
+ is_proprietary: false
321
+ is_sentence_transformers_compatible: true
322
+ distilbert-base-uncased:
323
+ link: https://huggingface.co/distilbert-base-uncased
324
+ seq_len: 512
325
+ size: 110
326
+ dim: 768
327
+ is_external: false
328
+ is_proprietary: false
329
+ is_sentence_transformers_compatible: true
330
+ distiluse-base-multilingual-cased-v2:
331
+ link: https://huggingface.co/sentence-transformers/distiluse-base-multilingual-cased-v2
332
+ seq_len: 512
333
+ size: 135
334
+ dim: 512
335
+ is_external: true
336
+ is_proprietary: false
337
+ is_sentence_transformers_compatible: true
338
+ e5-base:
339
+ link: https://huggingface.co/intfloat/e5-base
340
+ seq_len: 512
341
+ size: 110
342
+ dim: 768
343
+ is_external: true
344
+ is_proprietary: false
345
+ is_sentence_transformers_compatible: true
346
+ e5-large:
347
+ link: https://huggingface.co/intfloat/e5-large
348
+ seq_len: 512
349
+ size: 335
350
+ dim: 1024
351
+ is_external: true
352
+ is_proprietary: false
353
+ is_sentence_transformers_compatible: true
354
+ e5-mistral-7b-instruct:
355
+ link: https://huggingface.co/intfloat/e5-mistral-7b-instruct
356
+ seq_len: 32768
357
+ size: 7111
358
+ dim: 4096
359
+ is_external: true
360
+ is_proprietary: false
361
+ is_sentence_transformers_compatible: true
362
+ e5-small:
363
+ link: https://huggingface.co/intfloat/e5-small
364
+ seq_len: 512
365
+ size: 33
366
+ dim: 384
367
+ is_external: true
368
+ is_proprietary: false
369
+ is_sentence_transformers_compatible: true
370
+ electra-small-nordic:
371
+ link: https://huggingface.co/jonfd/electra-small-nordic
372
+ seq_len: 512
373
+ size: 23
374
+ dim: 256
375
+ is_external: true
376
+ is_proprietary: false
377
+ is_sentence_transformers_compatible: true
378
+ electra-small-swedish-cased-discriminator:
379
+ link: https://huggingface.co/KBLab/electra-small-swedish-cased-discriminator
380
+ seq_len: 512
381
+ size: 16
382
+ dim: 256
383
+ is_external: true
384
+ is_proprietary: false
385
+ is_sentence_transformers_compatible: true
386
+ flaubert_base_cased:
387
+ link: https://huggingface.co/flaubert/flaubert_base_cased
388
+ seq_len: 512
389
+ size: 138
390
+ dim: 768
391
+ is_external: true
392
+ is_proprietary: false
393
+ is_sentence_transformers_compatible: true
394
+ flaubert_base_uncased:
395
+ link: https://huggingface.co/flaubert/flaubert_base_uncased
396
+ seq_len: 512
397
+ size: 138
398
+ dim: 768
399
+ is_external: true
400
+ is_proprietary: false
401
+ is_sentence_transformers_compatible: true
402
+ flaubert_large_cased:
403
+ link: https://huggingface.co/flaubert/flaubert_large_cased
404
+ seq_len: 512
405
+ size: 372
406
+ dim: 1024
407
+ is_external: true
408
+ is_proprietary: false
409
+ is_sentence_transformers_compatible: true
410
+ gbert-base:
411
+ link: https://huggingface.co/deepset/gbert-base
412
+ seq_len: 512
413
+ size: 110
414
+ dim: 768
415
+ is_external: true
416
+ is_proprietary: false
417
+ is_sentence_transformers_compatible: true
418
+ gbert-large:
419
+ link: https://huggingface.co/deepset/gbert-large
420
+ seq_len: 512
421
+ size: 337
422
+ dim: 1024
423
+ is_external: true
424
+ is_proprietary: false
425
+ is_sentence_transformers_compatible: true
426
+ gelectra-base:
427
+ link: https://huggingface.co/deepset/gelectra-base
428
+ seq_len: 512
429
+ size: 110
430
+ dim: 768
431
+ is_external: true
432
+ is_proprietary: false
433
+ is_sentence_transformers_compatible: true
434
+ gelectra-large:
435
+ link: https://huggingface.co/deepset/gelectra-large
436
+ seq_len: 512
437
+ size: 335
438
+ dim: 1024
439
+ is_external: true
440
+ is_proprietary: false
441
+ is_sentence_transformers_compatible: true
442
+ glove.6B.300d:
443
+ link: https://huggingface.co/sentence-transformers/average_word_embeddings_glove.6B.300d
444
+ seq_len: N/A
445
+ size: 120
446
+ dim: 300
447
+ is_external: true
448
+ is_proprietary: false
449
+ is_sentence_transformers_compatible: true
450
+ google-gecko-256.text-embedding-preview-0409:
451
+ link: https://cloud.google.com/vertex-ai/generative-ai/docs/embeddings/get-text-embeddings#latest_models
452
+ seq_len: 2048
453
+ size: 1200
454
+ dim: 256
455
+ is_external: true
456
+ is_proprietary: true
457
+ is_sentence_transformers_compatible: false
458
+ google-gecko.text-embedding-preview-0409:
459
+ link: https://cloud.google.com/vertex-ai/generative-ai/docs/embeddings/get-text-embeddings#latest_models
460
+ seq_len: 2048
461
+ size: 1200
462
+ dim: 768
463
+ is_external: true
464
+ is_proprietary: true
465
+ is_sentence_transformers_compatible: false
466
+ gottbert-base:
467
+ link: https://huggingface.co/uklfr/gottbert-base
468
+ seq_len: 512
469
+ size: 127
470
+ dim: 768
471
+ is_external: true
472
+ is_proprietary: false
473
+ is_sentence_transformers_compatible: true
474
+ gtr-t5-base:
475
+ link: https://huggingface.co/sentence-transformers/gtr-t5-base
476
+ seq_len: 512
477
+ size: 110
478
+ dim: 768
479
+ is_external: true
480
+ is_proprietary: false
481
+ is_sentence_transformers_compatible: true
482
+ gtr-t5-large:
483
+ link: https://huggingface.co/sentence-transformers/gtr-t5-large
484
+ seq_len: 512
485
+ size: 168
486
+ dim: 768
487
+ is_external: true
488
+ is_proprietary: false
489
+ is_sentence_transformers_compatible: true
490
+ gtr-t5-xl:
491
+ link: https://huggingface.co/sentence-transformers/gtr-t5-xl
492
+ seq_len: 512
493
+ size: 1240
494
+ dim: 768
495
+ is_external: true
496
+ is_proprietary: false
497
+ is_sentence_transformers_compatible: true
498
+ gtr-t5-xxl:
499
+ link: https://huggingface.co/sentence-transformers/gtr-t5-xxl
500
+ seq_len: 512
501
+ size: 4865
502
+ dim: 768
503
+ is_external: true
504
+ is_proprietary: false
505
+ is_sentence_transformers_compatible: true
506
+ herbert-base-retrieval-v2:
507
+ link: https://huggingface.co/ipipan/herbert-base-retrieval-v2
508
+ seq_len: 514
509
+ size: 125
510
+ dim: 768
511
+ is_external: true
512
+ is_proprietary: false
513
+ is_sentence_transformers_compatible: true
514
+ komninos:
515
+ link: https://huggingface.co/sentence-transformers/average_word_embeddings_komninos
516
+ seq_len: N/A
517
+ size: 134
518
+ dim: 300
519
+ is_external: true
520
+ is_proprietary: false
521
+ is_sentence_transformers_compatible: true
522
+ luotuo-bert-medium:
523
+ link: https://huggingface.co/silk-road/luotuo-bert-medium
524
+ seq_len: 512
525
+ size: 328
526
+ dim: 768
527
+ is_external: true
528
+ is_proprietary: false
529
+ is_sentence_transformers_compatible: true
530
+ m3e-base:
531
+ link: https://huggingface.co/moka-ai/m3e-base
532
+ seq_len: 512
533
+ size: 102
534
+ dim: 768
535
+ is_external: true
536
+ is_proprietary: false
537
+ is_sentence_transformers_compatible: true
538
+ m3e-large:
539
+ link: https://huggingface.co/moka-ai/m3e-large
540
+ seq_len: 512
541
+ size: 102
542
+ dim: 768
543
+ is_external: true
544
+ is_proprietary: false
545
+ is_sentence_transformers_compatible: true
546
+ mistral-embed:
547
+ link: https://docs.mistral.ai/guides/embeddings
548
+ seq_len: null
549
+ size: null
550
+ dim: 1024
551
+ is_external: true
552
+ is_proprietary: true
553
+ is_sentence_transformers_compatible: false
554
+ msmarco-bert-co-condensor:
555
+ link: https://huggingface.co/sentence-transformers/msmarco-bert-co-condensor
556
+ seq_len: 512
557
+ size: 110
558
+ dim: 768
559
+ is_external: true
560
+ is_proprietary: false
561
+ is_sentence_transformers_compatible: true
562
+ multi-qa-MiniLM-L6-cos-v1:
563
+ link: https://huggingface.co/sentence-transformers/multi-qa-MiniLM-L6-cos-v1
564
+ seq_len: 512
565
+ size: 23
566
+ dim: 384
567
+ is_external: true
568
+ is_proprietary: false
569
+ is_sentence_transformers_compatible: true
570
+ multilingual-e5-base:
571
+ link: https://huggingface.co/intfloat/multilingual-e5-base
572
+ seq_len: 514
573
+ size: 278
574
+ dim: 768
575
+ is_external: true
576
+ is_proprietary: false
577
+ is_sentence_transformers_compatible: true
578
+ multilingual-e5-large:
579
+ link: https://huggingface.co/intfloat/multilingual-e5-large
580
+ seq_len: 514
581
+ size: 560
582
+ dim: 1024
583
+ is_external: true
584
+ is_proprietary: false
585
+ is_sentence_transformers_compatible: true
586
+ multilingual-e5-small:
587
+ link: https://huggingface.co/intfloat/multilingual-e5-small
588
+ seq_len: 512
589
+ size: 118
590
+ dim: 384
591
+ is_external: true
592
+ is_proprietary: false
593
+ is_sentence_transformers_compatible: true
594
+ nb-bert-base:
595
+ link: https://huggingface.co/NbAiLab/nb-bert-base
596
+ seq_len: 512
597
+ size: 179
598
+ dim: 768
599
+ is_external: true
600
+ is_proprietary: false
601
+ is_sentence_transformers_compatible: true
602
+ nb-bert-large:
603
+ link: https://huggingface.co/NbAiLab/nb-bert-large
604
+ seq_len: 512
605
+ size: 355
606
+ dim: 1024
607
+ is_external: true
608
+ is_proprietary: false
609
+ is_sentence_transformers_compatible: true
610
+ nomic-embed-text-v1.5-128:
611
+ link: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5
612
+ seq_len: 8192
613
+ size: 138
614
+ dim: 128
615
+ is_external: true
616
+ is_proprietary: false
617
+ is_sentence_transformers_compatible: true
618
+ nomic-embed-text-v1.5-256:
619
+ link: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5
620
+ seq_len: 8192
621
+ size: 138
622
+ dim: 256
623
+ is_external: true
624
+ is_proprietary: false
625
+ is_sentence_transformers_compatible: true
626
+ nomic-embed-text-v1.5-512:
627
+ link: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5
628
+ seq_len: 8192
629
+ size: 138
630
+ dim: 512
631
+ is_external: true
632
+ is_proprietary: false
633
+ is_sentence_transformers_compatible: true
634
+ nomic-embed-text-v1.5-64:
635
+ link: https://huggingface.co/nomic-ai/nomic-embed-text-v1.5
636
+ seq_len: 8192
637
+ size: 138
638
+ dim: 64
639
+ is_external: true
640
+ is_proprietary: false
641
+ is_sentence_transformers_compatible: true
642
+ norbert3-base:
643
+ link: https://huggingface.co/ltg/norbert3-base
644
+ seq_len: 512
645
+ size: 131
646
+ dim: 768
647
+ is_external: true
648
+ is_proprietary: false
649
+ is_sentence_transformers_compatible: true
650
+ norbert3-large:
651
+ link: https://huggingface.co/ltg/norbert3-large
652
+ seq_len: 512
653
+ size: 368
654
+ dim: 1024
655
+ is_external: true
656
+ is_proprietary: false
657
+ is_sentence_transformers_compatible: true
658
+ paraphrase-multilingual-MiniLM-L12-v2:
659
+ link: https://huggingface.co/sentence-transformers/paraphrase-multilingual-MiniLM-L12-v2
660
+ seq_len: 512
661
+ size: 118
662
+ dim: 384
663
+ is_external: true
664
+ is_proprietary: false
665
+ is_sentence_transformers_compatible: true
666
+ paraphrase-multilingual-mpnet-base-v2:
667
+ link: https://huggingface.co/sentence-transformers/paraphrase-multilingual-mpnet-base-v2
668
+ seq_len: 514
669
+ size: 278
670
+ dim: 768
671
+ is_external: true
672
+ is_proprietary: false
673
+ is_sentence_transformers_compatible: true
674
+ sentence-bert-swedish-cased:
675
+ link: https://huggingface.co/KBLab/sentence-bert-swedish-cased
676
+ seq_len: 512
677
+ size: 125
678
+ dim: 768
679
+ is_external: true
680
+ is_proprietary: false
681
+ is_sentence_transformers_compatible: true
682
+ sentence-camembert-base:
683
+ link: https://huggingface.co/dangvantuan/sentence-camembert-base
684
+ seq_len: 512
685
+ size: 110
686
+ dim: 768
687
+ is_external: true
688
+ is_proprietary: false
689
+ is_sentence_transformers_compatible: true
690
+ sentence-camembert-large:
691
+ link: https://huggingface.co/dangvantuan/sentence-camembert-large
692
+ seq_len: 512
693
+ size: 337
694
+ dim: 1024
695
+ is_external: true
696
+ is_proprietary: false
697
+ is_sentence_transformers_compatible: true
698
+ sentence-croissant-llm-base:
699
+ link: https://huggingface.co/Wissam42/sentence-croissant-llm-base
700
+ seq_len: 2048
701
+ size: 1280
702
+ dim: 2048
703
+ is_external: true
704
+ is_proprietary: false
705
+ is_sentence_transformers_compatible: true
706
+ sentence-t5-base:
707
+ link: https://huggingface.co/sentence-transformers/sentence-t5-base
708
+ seq_len: 512
709
+ size: 110
710
+ dim: 768
711
+ is_external: true
712
+ is_proprietary: false
713
+ is_sentence_transformers_compatible: true
714
+ sentence-t5-large:
715
+ link: https://huggingface.co/sentence-transformers/sentence-t5-large
716
+ seq_len: 512
717
+ size: 168
718
+ dim: 768
719
+ is_external: true
720
+ is_proprietary: false
721
+ is_sentence_transformers_compatible: true
722
+ sentence-t5-xl:
723
+ link: https://huggingface.co/sentence-transformers/sentence-t5-xl
724
+ seq_len: 512
725
+ size: 1240
726
+ dim: 768
727
+ is_external: true
728
+ is_proprietary: false
729
+ is_sentence_transformers_compatible: true
730
+ sentence-t5-xxl:
731
+ link: https://huggingface.co/sentence-transformers/sentence-t5-xxl
732
+ seq_len: 512
733
+ size: 4865
734
+ dim: 768
735
+ is_external: true
736
+ is_proprietary: false
737
+ is_sentence_transformers_compatible: true
738
+ silver-retriever-base-v1:
739
+ link: https://huggingface.co/ipipan/silver-retriever-base-v1
740
+ seq_len: 514
741
+ size: 125
742
+ dim: 768
743
+ is_external: true
744
+ is_proprietary: false
745
+ is_sentence_transformers_compatible: true
746
+ st-polish-paraphrase-from-distilroberta:
747
+ link: https://huggingface.co/sdadas/st-polish-paraphrase-from-distilroberta
748
+ seq_len: 514
749
+ size: 125
750
+ dim: 768
751
+ is_external: true
752
+ is_proprietary: false
753
+ is_sentence_transformers_compatible: true
754
+ st-polish-paraphrase-from-mpnet:
755
+ link: https://huggingface.co/sdadas/st-polish-paraphrase-from-mpnet
756
+ seq_len: 514
757
+ size: 125
758
+ dim: 768
759
+ is_external: true
760
+ is_proprietary: false
761
+ is_sentence_transformers_compatible: true
762
+ sup-simcse-bert-base-uncased:
763
+ link: https://huggingface.co/princeton-nlp/sup-simcse-bert-base-uncased
764
+ seq_len: 512
765
+ size: 110
766
+ dim: 768
767
+ is_external: true
768
+ is_proprietary: false
769
+ is_sentence_transformers_compatible: true
770
+ text-embedding-3-large:
771
+ link: https://openai.com/blog/new-embedding-models-and-api-updates
772
+ seq_len: 8191
773
+ size: null
774
+ dim: 3072
775
+ is_external: true
776
+ is_proprietary: true
777
+ is_sentence_transformers_compatible: false
778
+ text-embedding-3-large-256:
779
+ link: https://openai.com/blog/new-embedding-models-and-api-updates
780
+ seq_len: 8191
781
+ size: null
782
+ dim: 256
783
+ is_external: true
784
+ is_proprietary: true
785
+ is_sentence_transformers_compatible: false
786
+ text-embedding-3-small:
787
+ link: https://openai.com/blog/new-embedding-models-and-api-updates
788
+ seq_len: 8191
789
+ size: null
790
+ dim: 1536
791
+ is_external: true
792
+ is_proprietary: true
793
+ is_sentence_transformers_compatible: false
794
+ text-embedding-ada-002:
795
+ link: https://openai.com/blog/new-and-improved-embedding-model
796
+ seq_len: 8191
797
+ size: null
798
+ dim: 1536
799
+ is_external: true
800
+ is_proprietary: true
801
+ is_sentence_transformers_compatible: false
802
+ text-search-ada-001:
803
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
804
+ seq_len: 2046
805
+ size: null
806
+ dim: 1024
807
+ is_external: true
808
+ is_proprietary: true
809
+ is_sentence_transformers_compatible: false
810
+ text-search-ada-doc-001:
811
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
812
+ seq_len: 2046
813
+ size: null
814
+ dim: 1024
815
+ is_external: true
816
+ is_proprietary: true
817
+ is_sentence_transformers_compatible: false
818
+ text-search-ada-query-001:
819
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
820
+ seq_len: 2046
821
+ size: null
822
+ dim: 1024
823
+ is_external: false
824
+ is_proprietary: true
825
+ is_sentence_transformers_compatible: false
826
+ text-search-babbage-001:
827
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
828
+ seq_len: 2046
829
+ size: null
830
+ dim: 2048
831
+ is_external: true
832
+ is_proprietary: true
833
+ is_sentence_transformers_compatible: false
834
+ text-search-curie-001:
835
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
836
+ seq_len: 2046
837
+ size: null
838
+ dim: 4096
839
+ is_external: true
840
+ is_proprietary: true
841
+ is_sentence_transformers_compatible: false
842
+ text-search-davinci-001:
843
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
844
+ seq_len: 2046
845
+ size: null
846
+ dim: 12288
847
+ is_external: true
848
+ is_proprietary: true
849
+ is_sentence_transformers_compatible: false
850
+ text-similarity-ada-001:
851
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
852
+ seq_len: 2046
853
+ size: null
854
+ dim: 1024
855
+ is_external: true
856
+ is_proprietary: true
857
+ is_sentence_transformers_compatible: false
858
+ text-similarity-babbage-001:
859
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
860
+ seq_len: 2046
861
+ size: null
862
+ dim: 2048
863
+ is_external: true
864
+ is_proprietary: true
865
+ is_sentence_transformers_compatible: false
866
+ text-similarity-curie-001:
867
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
868
+ seq_len: 2046
869
+ size: null
870
+ dim: 4096
871
+ is_external: true
872
+ is_proprietary: true
873
+ is_sentence_transformers_compatible: false
874
+ text-similarity-davinci-001:
875
+ link: https://openai.com/blog/introducing-text-and-code-embeddings
876
+ seq_len: 2046
877
+ size: null
878
+ dim: 12288
879
+ is_external: true
880
+ is_proprietary: true
881
+ is_sentence_transformers_compatible: false
882
+ text2vec-base-chinese:
883
+ link: https://huggingface.co/shibing624/text2vec-base-chinese
884
+ seq_len: 512
885
+ size: 102
886
+ dim: 768
887
+ is_external: true
888
+ is_proprietary: false
889
+ is_sentence_transformers_compatible: true
890
+ text2vec-base-multilingual:
891
+ link: null
892
+ seq_len: null
893
+ size: null
894
+ dim: null
895
+ is_external: true
896
+ is_proprietary: false
897
+ is_sentence_transformers_compatible: false
898
+ text2vec-large-chinese:
899
+ link: https://huggingface.co/GanymedeNil/text2vec-large-chinese
900
+ seq_len: 512
901
+ size: 326
902
+ dim: 1024
903
+ is_external: true
904
+ is_proprietary: false
905
+ is_sentence_transformers_compatible: true
906
+ titan-embed-text-v1:
907
+ link: https://docs.aws.amazon.com/bedrock/latest/userguide/embeddings.html
908
+ seq_len: 8000
909
+ size: null
910
+ dim: 1536
911
+ is_external: true
912
+ is_proprietary: true
913
+ is_sentence_transformers_compatible: false
914
+ udever-bloom-1b1:
915
+ link: https://huggingface.co/izhx/udever-bloom-1b1
916
+ seq_len: 2048
917
+ size: null
918
+ dim: 1536
919
+ is_external: true
920
+ is_proprietary: false
921
+ is_sentence_transformers_compatible: true
922
+ udever-bloom-560m:
923
+ link: https://huggingface.co/izhx/udever-bloom-560m
924
+ seq_len: 2048
925
+ size: null
926
+ dim: 1024
927
+ is_external: true
928
+ is_proprietary: false
929
+ is_sentence_transformers_compatible: true
930
+ universal-sentence-encoder-multilingual-3:
931
+ link: https://huggingface.co/vprelovac/universal-sentence-encoder-multilingual-3
932
+ seq_len: 512
933
+ size: null
934
+ dim: 512
935
+ is_external: true
936
+ is_proprietary: false
937
+ is_sentence_transformers_compatible: true
938
+ universal-sentence-encoder-multilingual-large-3:
939
+ link: https://huggingface.co/vprelovac/universal-sentence-encoder-multilingual-large-3
940
+ seq_len: 512
941
+ size: null
942
+ dim: 512
943
+ is_external: true
944
+ is_proprietary: false
945
+ is_sentence_transformers_compatible: true
946
+ unsup-simcse-bert-base-uncased:
947
+ link: https://huggingface.co/princeton-nlp/unsup-simcse-bert-base-uncased
948
+ seq_len: 512
949
+ size: 110
950
+ dim: 768
951
+ is_external: true
952
+ is_proprietary: false
953
+ is_sentence_transformers_compatible: true
954
+ use-cmlm-multilingual:
955
+ link: https://huggingface.co/sentence-transformers/use-cmlm-multilingual
956
+ seq_len: 512
957
+ size: 472
958
+ dim: 768
959
+ is_external: true
960
+ is_proprietary: false
961
+ is_sentence_transformers_compatible: true
962
+ voyage-2:
963
+ link: https://docs.voyageai.com/embeddings/
964
+ seq_len: 1024
965
+ size: null
966
+ dim: 1024
967
+ is_external: true
968
+ is_proprietary: true
969
+ is_sentence_transformers_compatible: false
970
+ voyage-code-2:
971
+ link: https://docs.voyageai.com/embeddings/
972
+ seq_len: 16000
973
+ size: null
974
+ dim: 1536
975
+ is_external: true
976
+ is_proprietary: true
977
+ is_sentence_transformers_compatible: false
978
+ voyage-large-2-instruct:
979
+ link: https://docs.voyageai.com/embeddings/
980
+ seq_len: 16000
981
+ size: null
982
+ dim: 1024
983
+ is_external: true
984
+ is_proprietary: false
985
+ is_sentence_transformers_compatible: false
986
+ voyage-law-2:
987
+ link: https://docs.voyageai.com/embeddings/
988
+ seq_len: 4000
989
+ size: null
990
+ dim: 1024
991
+ is_external: true
992
+ is_proprietary: true
993
+ is_sentence_transformers_compatible: false
994
+ voyage-lite-01-instruct:
995
+ link: https://docs.voyageai.com/embeddings/
996
+ seq_len: 4000
997
+ size: null
998
+ dim: 1024
999
+ is_external: true
1000
+ is_proprietary: true
1001
+ is_sentence_transformers_compatible: false
1002
+ voyage-lite-02-instruct:
1003
+ link: https://docs.voyageai.com/embeddings/
1004
+ seq_len: 4000
1005
+ size: 1220
1006
+ dim: 1024
1007
+ is_external: true
1008
+ is_proprietary: true
1009
+ is_sentence_transformers_compatible: false
1010
+ xlm-roberta-base:
1011
+ link: https://huggingface.co/xlm-roberta-base
1012
+ seq_len: 514
1013
+ size: 279
1014
+ dim: 768
1015
+ is_external: true
1016
+ is_proprietary: false
1017
+ is_sentence_transformers_compatible: true
1018
+ xlm-roberta-large:
1019
+ link: https://huggingface.co/xlm-roberta-large
1020
+ seq_len: 514
1021
+ size: 560
1022
+ dim: 1024
1023
+ is_external: true
1024
+ is_proprietary: false
1025
+ is_sentence_transformers_compatible: true
1026
+ models_to_skip:
1027
+ - michaelfeil/ct2fast-e5-large-v2
1028
+ - McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-unsup-simcse
1029
+ - newsrx/instructor-xl
1030
+ - sionic-ai/sionic-ai-v1
1031
+ - lsf1000/bge-evaluation
1032
+ - Intel/bge-small-en-v1.5-sst2
1033
+ - newsrx/instructor-xl-newsrx
1034
+ - McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-unsup-simcse
1035
+ - davidpeer/gte-small
1036
+ - goldenrooster/multilingual-e5-large
1037
+ - kozistr/fused-large-en
1038
+ - mixamrepijey/instructor-small
1039
+ - McGill-NLP/LLM2Vec-Llama-2-7b-chat-hf-mntp-supervised
1040
+ - DecisionOptimizationSystem/DeepFeatEmbeddingLargeContext
1041
+ - Intel/bge-base-en-v1.5-sst2-int8-dynamic
1042
+ - morgendigital/multilingual-e5-large-quantized
1043
+ - BAAI/bge-small-en
1044
+ - ggrn/e5-small-v2
1045
+ - vectoriseai/gte-small
1046
+ - giulio98/placeholder
1047
+ - odunola/UAE-Large-VI
1048
+ - vectoriseai/e5-large-v2
1049
+ - gruber/e5-small-v2-ggml
1050
+ - Severian/nomic
1051
+ - arcdev/e5-mistral-7b-instruct
1052
+ - mlx-community/multilingual-e5-base-mlx
1053
+ - michaelfeil/ct2fast-bge-base-en-v1.5
1054
+ - Intel/bge-small-en-v1.5-sst2-int8-static
1055
+ - jncraton/stella-base-en-v2-ct2-int8
1056
+ - vectoriseai/multilingual-e5-large
1057
+ - rlsChapters/Chapters-SFR-Embedding-Mistral
1058
+ - arcdev/SFR-Embedding-Mistral
1059
+ - McGill-NLP/LLM2Vec-Mistral-7B-Instruct-v2-mntp-supervised
1060
+ - vectoriseai/gte-base
1061
+ - mixamrepijey/instructor-models
1062
+ - GovCompete/e5-large-v2
1063
+ - ef-zulla/e5-multi-sml-torch
1064
+ - khoa-klaytn/bge-small-en-v1.5-angle
1065
+ - krilecy/e5-mistral-7b-instruct
1066
+ - vectoriseai/bge-base-en-v1.5
1067
+ - vectoriseai/instructor-base
1068
+ - jingyeom/korean_embedding_model
1069
+ - rizki/bgr-tf
1070
+ - barisaydin/bge-base-en
1071
+ - jamesgpt1/zzz
1072
+ - Malmuk1/e5-large-v2_Sharded
1073
+ - vectoriseai/ember-v1
1074
+ - Consensus/instructor-base
1075
+ - barisaydin/bge-small-en
1076
+ - barisaydin/gte-base
1077
+ - woody72/multilingual-e5-base
1078
+ - Einas/einas_ashkar
1079
+ - michaelfeil/ct2fast-bge-large-en-v1.5
1080
+ - vectoriseai/bge-small-en-v1.5
1081
+ - iampanda/Test
1082
+ - cherubhao/yogamodel
1083
+ - ieasybooks/multilingual-e5-large-onnx
1084
+ - jncraton/e5-small-v2-ct2-int8
1085
+ - radames/e5-large
1086
+ - khoa-klaytn/bge-base-en-v1.5-angle
1087
+ - Intel/bge-base-en-v1.5-sst2-int8-static
1088
+ - vectoriseai/e5-large
1089
+ - TitanML/jina-v2-base-en-embed
1090
+ - Koat/gte-tiny
1091
+ - binqiangliu/EmbeddingModlebgelargeENv1.5
1092
+ - beademiguelperez/sentence-transformers-multilingual-e5-small
1093
+ - sionic-ai/sionic-ai-v2
1094
+ - jamesdborin/jina-v2-base-en-embed
1095
+ - maiyad/multilingual-e5-small
1096
+ - dmlls/all-mpnet-base-v2
1097
+ - odunola/e5-base-v2
1098
+ - vectoriseai/bge-large-en-v1.5
1099
+ - vectoriseai/bge-small-en
1100
+ - karrar-alwaili/UAE-Large-V1
1101
+ - t12e/instructor-base
1102
+ - Frazic/udever-bloom-3b-sentence
1103
+ - Geolumina/instructor-xl
1104
+ - hsikchi/dump
1105
+ - recipe/embeddings
1106
+ - michaelfeil/ct2fast-bge-small-en-v1.5
1107
+ - ildodeltaRule/multilingual-e5-large
1108
+ - shubham-bgi/UAE-Large
1109
+ - BAAI/bge-large-en
1110
+ - michaelfeil/ct2fast-e5-small-v2
1111
+ - cgldo/semanticClone
1112
+ - barisaydin/gte-small
1113
+ - aident-ai/bge-base-en-onnx
1114
+ - jamesgpt1/english-large-v1
1115
+ - michaelfeil/ct2fast-e5-small
1116
+ - baseplate/instructor-large-1
1117
+ - newsrx/instructor-large
1118
+ - Narsil/bge-base-en
1119
+ - michaelfeil/ct2fast-e5-large
1120
+ - mlx-community/multilingual-e5-small-mlx
1121
+ - lightbird-ai/nomic
1122
+ - MaziyarPanahi/GritLM-8x7B-GGUF
1123
+ - newsrx/instructor-large-newsrx
1124
+ - dhairya0907/thenlper-get-large
1125
+ - barisaydin/bge-large-en
1126
+ - jncraton/bge-small-en-ct2-int8
1127
+ - retrainai/instructor-xl
1128
+ - BAAI/bge-base-en
1129
+ - gentlebowl/instructor-large-safetensors
1130
+ - d0rj/e5-large-en-ru
1131
+ - atian-chapters/Chapters-SFR-Embedding-Mistral
1132
+ - Intel/bge-base-en-v1.5-sts-int8-static
1133
+ - Intel/bge-base-en-v1.5-sts-int8-dynamic
1134
+ - jncraton/GIST-small-Embedding-v0-ct2-int8
1135
+ - jncraton/gte-tiny-ct2-int8
1136
+ - d0rj/e5-small-en-ru
1137
+ - vectoriseai/e5-small-v2
1138
+ - SmartComponents/bge-micro-v2
1139
+ - michaelfeil/ct2fast-gte-base
1140
+ - vectoriseai/e5-base-v2
1141
+ - Intel/bge-base-en-v1.5-sst2
1142
+ - McGill-NLP/LLM2Vec-Sheared-LLaMA-mntp-supervised
1143
+ - Research2NLP/electrical_stella
1144
+ - weakit-v/bge-base-en-v1.5-onnx
1145
+ - GovCompete/instructor-xl
1146
+ - barisaydin/text2vec-base-multilingual
1147
+ - Intel/bge-small-en-v1.5-sst2-int8-dynamic
1148
+ - jncraton/gte-small-ct2-int8
1149
+ - d0rj/e5-base-en-ru
1150
+ - barisaydin/gte-large
1151
+ - fresha/e5-large-v2-endpoint
1152
+ - vectoriseai/instructor-large
1153
+ - Severian/embed
1154
+ - vectoriseai/e5-base
1155
+ - mlx-community/multilingual-e5-large-mlx
1156
+ - vectoriseai/gte-large
1157
+ - anttip/ct2fast-e5-small-v2-hfie
1158
+ - michaelfeil/ct2fast-gte-large
1159
+ - gizmo-ai/Cohere-embed-multilingual-v3.0
1160
+ - McGill-NLP/LLM2Vec-Llama-2-7b-chat-hf-mntp-unsup-simcse
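envs.py loads this file the same way into MODEL_META. How app.py applies models_to_skip is not visible in this diff, so the filtering below is only illustrative:

from yaml import safe_load

with open("model_meta.yaml", "r", encoding="utf-8") as f:
    meta = safe_load(f)

models = meta["model_meta"]          # model name -> {link, seq_len, size, dim, ...}
skip = set(meta["models_to_skip"])   # Hub repo ids to exclude from the boards

print(len(models), "models with metadata;", len(skip), "repos skipped")
print(models["all-MiniLM-L6-v2"]["dim"])  # -> 384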