---
base_model: sentence-transformers/paraphrase-MiniLM-L3-v2
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 'metrics.statistics.polysyllables: 4603, 102, 339, 2604, 397, 1180, 555, 1488, 1226, 378, 6639, 1978, 4088, 7005, 3256, 86, 2338, 1905, 1647, 16369'
- text: 'company.relationship: founder, None, founder/chairman, Relation, relation, CEO, chairman, investor, founder and CEO, founder/CEO, owner, chairman of management committee, founder and chairman, Chairman and Chief Executive Officer, general director, executive chairman, Chairman/founder, founder, chairman, ceo, former chairman and CEO, relation and chairman'
- text: 'variety: Western, Eastern'
- text: 'Data.Fat.Saturated Fat: 2.009, 1.164, 1.86, 2.154, 0.568, 0.117, 1.11, 0.049, 0.66, 1.242, 1.899, 0.596, 2.667, 0.044, 2.554, 0.633, 4.591, 1.214, 0.121, 5.486'
- text: 'Date.Full: 8/26/1990, 3/24/1991, 3/31/1991, 4/7/1991, 4/14/1991, 4/21/1991, 4/28/1991, 5/5/1991, 5/12/1991, 5/19/1991, 5/26/1991, 6/2/1991, 6/9/1991, 6/16/1991, 6/23/1991, 6/30/1991, 7/7/1991, 7/14/1991, 7/21/1991, 7/28/1991'
inference: true
model-index:
- name: SetFit with sentence-transformers/paraphrase-MiniLM-L3-v2
  results:
  - task:
      type: text-classification
      name: Text Classification
    dataset:
      name: Unknown
      type: unknown
      split: test
    metrics:
    - type: accuracy
      value: 0.7512388503468781
      name: Accuracy
---

# SetFit with sentence-transformers/paraphrase-MiniLM-L3-v2

This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-MiniLM-L3-v2](https://huggingface.co/sentence-transformers/paraphrase-MiniLM-L3-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification.

The model has been trained using an efficient few-shot learning technique that involves:

1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
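As a rough mental model of step 2, the classification head is a logistic regression fitted on sentence embeddings. The toy sketch below only illustrates that idea; it is not the SetFit training loop, and the labels are made up for the example.

```python
from sentence_transformers import SentenceTransformer
from sklearn.linear_model import LogisticRegression

# Toy illustration of step 2: embed the inputs, then fit a linear head.
# (In SetFit, the body is first fine-tuned with contrastive learning.)
body = SentenceTransformer("sentence-transformers/paraphrase-MiniLM-L3-v2")

texts = ["variety: Western, Eastern", "Date.Full: 8/26/1990, 3/24/1991"]
labels = ["Categorical", "Date"]  # made-up labels for illustration only

embeddings = body.encode(texts)
head = LogisticRegression().fit(embeddings, labels)
print(head.predict(body.encode(["Date.Full: 4/7/1991, 4/14/1991"])))
```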
## Model Details

### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [sentence-transformers/paraphrase-MiniLM-L3-v2](https://huggingface.co/sentence-transformers/paraphrase-MiniLM-L3-v2)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 128 tokens
- **Number of Classes:** 39 classes

### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)

### Model Labels
| Label                    | Examples |
|:-------------------------|:---------|
| Month Number             |          |
| Date                     |          |
| Categorical              |          |
| Year                     |          |
| Longitude                |          |
| Floating Point Number    |          |
| Slug                     |          |
| U.S. State Abbreviation  |          |
| Month Name               |          |
| Day of Month             |          |
| Currency Code            |          |
| Last Name                |          |
| Timestamp                |          |
| Day of Week              |          |
| Integer                  |          |
| Street Address           |          |
| URL                      |          |
| U.S. State               |          |
| Zip Code                 |          |
| Country Name             |          |
| Boolean                  |          |
| Short text               |          |
| Occupation               |          |
| Partial timestamp        |          |
| Street Name              |          |
| Full Name                |          |
| Very short text          |          |
| URI                      |          |
| Latitude                 |          |
| Time                     |          |
| Postal Code              |          |
| Country ISO Code         |          |
| First Name               |          |
| City Name                |          |
| Color                    |          |
| License Plate            |          |
| AM/PM                    |          |
| Company Name             |          |
| Secondary Address        |          |

## Evaluation

### Metrics
| Label   | Accuracy |
|:--------|:---------|
| **all** | 0.7512   |

## Uses

### Direct Use for Inference

First install the SetFit library:

```bash
pip install setfit
```

Then you can load this model and run inference.

```python
from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("quantisan/paraphrase-MiniLM-L3-v2-93dataset")
# Run inference
preds = model("variety: Western, Eastern")
```
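Inputs follow the same format as the widget examples above: a column or field name, a colon, and a comma-separated sample of that column's values. The sketch below shows one way to serialize a column and retrieve per-class probabilities; `serialize_column` is a hypothetical helper written for this example, not part of the SetFit API.

```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("quantisan/paraphrase-MiniLM-L3-v2-93dataset")

def serialize_column(name, values, max_values=20):
    """Hypothetical helper: render a column as 'name: v1, v2, ...'."""
    sample = ", ".join(str(v) for v in values[:max_values])
    return f"{name}: {sample}"

texts = [
    serialize_column("variety", ["Western", "Eastern"]),
    serialize_column("Date.Full", ["8/26/1990", "3/24/1991", "3/31/1991"]),
]

preds = model.predict(texts)        # predicted type labels, one per input
probs = model.predict_proba(texts)  # per-class probabilities (39 classes)
print(list(zip(texts, preds)))
```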
## Training Details

### Training Set Metrics
| Training set | Min | Median  | Max |
|:-------------|:----|:--------|:----|
| Word count   | 2   | 22.3314 | 85  |

| Label                    | Training Sample Count |
|:-------------------------|:----------------------|
| Categorical              | 8 |
| Timestamp                | 5 |
| Date                     | 8 |
| Integer                  | 8 |
| Partial timestamp        | 4 |
| Short text               | 8 |
| Very short text          | 3 |
| AM/PM                    | 1 |
| Boolean                  | 8 |
| City Name                | 1 |
| Color                    | 3 |
| Company Name             | 1 |
| Country ISO Code         | 2 |
| Country Name             | 8 |
| Currency Code            | 1 |
| Day of Month             | 4 |
| Day of Week              | 4 |
| First Name               | 1 |
| Floating Point Number    | 8 |
| Full Name                | 8 |
| Last Name                | 2 |
| Latitude                 | 4 |
| License Plate            | 1 |
| Longitude                | 4 |
| Month Name               | 6 |
| Month Number             | 4 |
| Occupation               | 3 |
| Postal Code              | 1 |
| Secondary Address        | 1 |
| Slug                     | 8 |
| Street Address           | 3 |
| Street Name              | 3 |
| Time                     | 3 |
| U.S. State               | 8 |
| U.S. State Abbreviation  | 6 |
| URI                      | 1 |
| URL                      | 8 |
| Year                     | 8 |
| Zip Code                 | 4 |

### Training Hyperparameters
- batch_size: (8, 8)
- num_epochs: (4, 4)
- max_steps: -1
- sampling_strategy: oversampling
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: True
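For orientation, the hyperparameters above map onto SetFit's `TrainingArguments`. The sketch below shows how a comparable run could be configured; the tiny inline dataset is a hypothetical stand-in, since the original few-shot training split is not distributed with this model, and this is not the exact script used.

```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical stand-in for the actual few-shot training data
# ("text" is a serialized column, "label" is one of the 39 classes).
train_dataset = Dataset.from_dict({
    "text": [
        "variety: Western, Eastern",
        "category: red, green, blue",
        "Date.Full: 8/26/1990, 3/24/1991, 3/31/1991",
        "start_date: 1/2/1980, 5/6/1975, 9/9/1999",
    ],
    "label": ["Categorical", "Categorical", "Date", "Date"],
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-MiniLM-L3-v2")

args = TrainingArguments(
    batch_size=(8, 8),
    num_epochs=(4, 4),
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    sampling_strategy="oversampling",
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()
```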
### Training Results
| Epoch  | Step  | Training Loss | Validation Loss |
|:------:|:-----:|:-------------:|:---------------:|
| 0.0003 | 1 | 0.3882 | - |
| 0.0140 | 50 | 0.1864 | - |
| 0.0280 | 100 | 0.1588 | - |
| 0.0421 | 150 | 0.15 | - |
| 0.0561 | 200 | 0.1537 | - |
| 0.0701 | 250 | 0.1325 | - |
| 0.0841 | 300 | 0.132 | - |
| 0.0981 | 350 | 0.1149 | - |
| 0.1121 | 400 | 0.1198 | - |
| 0.1262 | 450 | 0.1035 | - |
| 0.1402 | 500 | 0.0907 | - |
| 0.1542 | 550 | 0.0917 | - |
| 0.1682 | 600 | 0.0875 | - |
| 0.1822 | 650 | 0.0803 | - |
| 0.1962 | 700 | 0.0669 | - |
| 0.2103 | 750 | 0.0671 | - |
| 0.2243 | 800 | 0.0614 | - |
| 0.2383 | 850 | 0.0642 | - |
| 0.2523 | 900 | 0.0481 | - |
| 0.2663 | 950 | 0.0548 | - |
| 0.2803 | 1000 | 0.0346 | - |
| 0.2944 | 1050 | 0.0406 | - |
| 0.3084 | 1100 | 0.0403 | - |
| 0.3224 | 1150 | 0.0349 | - |
| 0.3364 | 1200 | 0.0312 | - |
| 0.3504 | 1250 | 0.0378 | - |
| 0.3645 | 1300 | 0.0335 | - |
| 0.3785 | 1350 | 0.0323 | - |
| 0.3925 | 1400 | 0.0234 | - |
| 0.4065 | 1450 | 0.0313 | - |
| 0.4205 | 1500 | 0.022 | - |
| 0.4345 | 1550 | 0.0326 | - |
| 0.4486 | 1600 | 0.0233 | - |
| 0.4626 | 1650 | 0.0195 | - |
| 0.4766 | 1700 | 0.0254 | - |
| 0.4906 | 1750 | 0.0211 | - |
| 0.5046 | 1800 | 0.0198 | - |
| 0.5186 | 1850 | 0.0201 | - |
| 0.5327 | 1900 | 0.0216 | - |
| 0.5467 | 1950 | 0.0174 | - |
| 0.5607 | 2000 | 0.0176 | - |
| 0.5747 | 2050 | 0.0234 | - |
| 0.5887 | 2100 | 0.0172 | - |
| 0.6027 | 2150 | 0.0129 | - |
| 0.6168 | 2200 | 0.0151 | - |
| 0.6308 | 2250 | 0.015 | - |
| 0.6448 | 2300 | 0.0164 | - |
| 0.6588 | 2350 | 0.0137 | - |
| 0.6728 | 2400 | 0.014 | - |
| 0.6869 | 2450 | 0.0154 | - |
| 0.7009 | 2500 | 0.0135 | - |
| 0.7149 | 2550 | 0.0164 | - |
| 0.7289 | 2600 | 0.0139 | - |
| 0.7429 | 2650 | 0.0164 | - |
| 0.7569 | 2700 | 0.0106 | - |
| 0.7710 | 2750 | 0.0084 | - |
| 0.7850 | 2800 | 0.0133 | - |
| 0.7990 | 2850 | 0.0114 | - |
| 0.8130 | 2900 | 0.0066 | - |
| 0.8270 | 2950 | 0.0091 | - |
| 0.8410 | 3000 | 0.0126 | - |
| 0.8551 | 3050 | 0.0107 | - |
| 0.8691 | 3100 | 0.0068 | - |
| 0.8831 | 3150 | 0.006 | - |
| 0.8971 | 3200 | 0.007 | - |
| 0.9111 | 3250 | 0.0155 | - |
| 0.9251 | 3300 | 0.0111 | - |
| 0.9392 | 3350 | 0.0049 | - |
| 0.9532 | 3400 | 0.0076 | - |
| 0.9672 | 3450 | 0.0092 | - |
| 0.9812 | 3500 | 0.0086 | - |
| 0.9952 | 3550 | 0.0061 | - |
| 1.0 | 3567 | - | 0.1341 |
| 1.0093 | 3600 | 0.0073 | - |
| 1.0233 | 3650 | 0.0065 | - |
| 1.0373 | 3700 | 0.0063 | - |
| 1.0513 | 3750 | 0.0094 | - |
| 1.0653 | 3800 | 0.0114 | - |
| 1.0793 | 3850 | 0.0084 | - |
| 1.0934 | 3900 | 0.0098 | - |
| 1.1074 | 3950 | 0.0058 | - |
| 1.1214 | 4000 | 0.0045 | - |
| 1.1354 | 4050 | 0.018 | - |
| 1.1494 | 4100 | 0.0077 | - |
| 1.1634 | 4150 | 0.0067 | - |
| 1.1775 | 4200 | 0.0061 | - |
| 1.1915 | 4250 | 0.0037 | - |
| 1.2055 | 4300 | 0.0045 | - |
| 1.2195 | 4350 | 0.0033 | - |
| 1.2335 | 4400 | 0.0067 | - |
| 1.2475 | 4450 | 0.0054 | - |
| 1.2616 | 4500 | 0.0057 | - |
| 1.2756 | 4550 | 0.004 | - |
| 1.2896 | 4600 | 0.0033 | - |
| 1.3036 | 4650 | 0.0076 | - |
| 1.3176 | 4700 | 0.0045 | - |
| 1.3317 | 4750 | 0.0068 | - |
| 1.3457 | 4800 | 0.0043 | - |
| 1.3597 | 4850 | 0.0049 | - |
| 1.3737 | 4900 | 0.0045 | - |
| 1.3877 | 4950 | 0.0055 | - |
| 1.4017 | 5000 | 0.0065 | - |
| 1.4158 | 5050 | 0.0029 | - |
| 1.4298 | 5100 | 0.0041 | - |
| 1.4438 | 5150 | 0.0064 | - |
| 1.4578 | 5200 | 0.0031 | - |
| 1.4718 | 5250 | 0.0078 | - |
| 1.4858 | 5300 | 0.0031 | - |
| 1.4999 | 5350 | 0.004 | - |
| 1.5139 | 5400 | 0.0035 | - |
| 1.5279 | 5450 | 0.0062 | - |
| 1.5419 | 5500 | 0.0062 | - |
| 1.5559 | 5550 | 0.0065 | - |
| 1.5699 | 5600 | 0.0036 | - |
| 1.5840 | 5650 | 0.0037 | - |
| 1.5980 | 5700 | 0.0047 | - |
| 1.6120 | 5750 | 0.0037 | - |
| 1.6260 | 5800 | 0.0028 | - |
| 1.6400 | 5850 | 0.0052 | - |
| 1.6541 | 5900 | 0.0043 | - |
| 1.6681 | 5950 | 0.0029 | - |
| 1.6821 | 6000 | 0.0064 | - |
| 1.6961 | 6050 | 0.0031 | - |
| 1.7101 | 6100 | 0.0023 | - |
| 1.7241 | 6150 | 0.002 | - |
| 1.7382 | 6200 | 0.0041 | - |
| 1.7522 | 6250 | 0.0033 | - |
| 1.7662 | 6300 | 0.0043 | - |
| 1.7802 | 6350 | 0.0023 | - |
| 1.7942 | 6400 | 0.0036 | - |
| 1.8082 | 6450 | 0.0024 | - |
| 1.8223 | 6500 | 0.0016 | - |
| 1.8363 | 6550 | 0.003 | - |
| 1.8503 | 6600 | 0.0043 | - |
| 1.8643 | 6650 | 0.0043 | - |
| 1.8783 | 6700 | 0.0017 | - |
| 1.8923 | 6750 | 0.0018 | - |
| 1.9064 | 6800 | 0.0029 | - |
| 1.9204 | 6850 | 0.0026 | - |
| 1.9344 | 6900 | 0.0039 | - |
| 1.9484 | 6950 | 0.0019 | - |
| 1.9624 | 7000 | 0.0041 | - |
| 1.9765 | 7050 | 0.0019 | - |
| 1.9905 | 7100 | 0.0023 | - |
| 2.0 | 7134 | - | 0.1286 |
| 2.0045 | 7150 | 0.0016 | - |
| 2.0185 | 7200 | 0.0017 | - |
| 2.0325 | 7250 | 0.0016 | - |
| 2.0465 | 7300 | 0.0019 | - |
| 2.0606 | 7350 | 0.0015 | - |
| 2.0746 | 7400 | 0.0016 | - |
| 2.0886 | 7450 | 0.0015 | - |
| 2.1026 | 7500 | 0.0015 | - |
| 2.1166 | 7550 | 0.0034 | - |
| 2.1306 | 7600 | 0.0043 | - |
| 2.1447 | 7650 | 0.0016 | - |
| 2.1587 | 7700 | 0.0016 | - |
| 2.1727 | 7750 | 0.0015 | - |
| 2.1867 | 7800 | 0.0015 | - |
| 2.2007 | 7850 | 0.0017 | - |
| 2.2147 | 7900 | 0.0013 | - |
| 2.2288 | 7950 | 0.0016 | - |
| 2.2428 | 8000 | 0.0013 | - |
| 2.2568 | 8050 | 0.0039 | - |
| 2.2708 | 8100 | 0.0053 | - |
| 2.2848 | 8150 | 0.0025 | - |
| 2.2989 | 8200 | 0.0015 | - |
| 2.3129 | 8250 | 0.0012 | - |
| 2.3269 | 8300 | 0.006 | - |
| 2.3409 | 8350 | 0.0014 | - |
| 2.3549 | 8400 | 0.0014 | - |
| 2.3689 | 8450 | 0.0028 | - |
| 2.3830 | 8500 | 0.0015 | - |
| 2.3970 | 8550 | 0.0019 | - |
| 2.4110 | 8600 | 0.0017 | - |
| 2.4250 | 8650 | 0.002 | - |
| 2.4390 | 8700 | 0.0016 | - |
| 2.4530 | 8750 | 0.0014 | - |
| 2.4671 | 8800 | 0.0021 | - |
| 2.4811 | 8850 | 0.0012 | - |
| 2.4951 | 8900 | 0.0015 | - |
| 2.5091 | 8950 | 0.0012 | - |
| 2.5231 | 9000 | 0.0012 | - |
| 2.5371 | 9050 | 0.0016 | - |
| 2.5512 | 9100 | 0.0016 | - |
| 2.5652 | 9150 | 0.0013 | - |
| 2.5792 | 9200 | 0.0028 | - |
| 2.5932 | 9250 | 0.0013 | - |
| 2.6072 | 9300 | 0.0011 | - |
| 2.6213 | 9350 | 0.0035 | - |
| 2.6353 | 9400 | 0.0013 | - |
| 2.6493 | 9450 | 0.0012 | - |
| 2.6633 | 9500 | 0.0037 | - |
| 2.6773 | 9550 | 0.0012 | - |
| 2.6913 | 9600 | 0.0011 | - |
| 2.7054 | 9650 | 0.0037 | - |
| 2.7194 | 9700 | 0.0012 | - |
| 2.7334 | 9750 | 0.0013 | - |
| 2.7474 | 9800 | 0.0013 | - |
| 2.7614 | 9850 | 0.001 | - |
| 2.7754 | 9900 | 0.0011 | - |
| 2.7895 | 9950 | 0.0012 | - |
| 2.8035 | 10000 | 0.0012 | - |
| 2.8175 | 10050 | 0.001 | - |
| 2.8315 | 10100 | 0.001 | - |
| 2.8455 | 10150 | 0.0011 | - |
| 2.8595 | 10200 | 0.0009 | - |
| 2.8736 | 10250 | 0.0018 | - |
| 2.8876 | 10300 | 0.0013 | - |
| 2.9016 | 10350 | 0.0009 | - |
| 2.9156 | 10400 | 0.0033 | - |
| 2.9296 | 10450 | 0.0034 | - |
| 2.9437 | 10500 | 0.0011 | - |
| 2.9577 | 10550 | 0.0013 | - |
| 2.9717 | 10600 | 0.0009 | - |
| 2.9857 | 10650 | 0.0009 | - |
| 2.9997 | 10700 | 0.0011 | - |
| 3.0 | 10701 | - | 0.1205 |
| 3.0137 | 10750 | 0.0009 | - |
| 3.0278 | 10800 | 0.0009 | - |
| 3.0418 | 10850 | 0.0032 | - |
| 3.0558 | 10900 | 0.0008 | - |
| 3.0698 | 10950 | 0.0013 | - |
| 3.0838 | 11000 | 0.0033 | - |
| 3.0978 | 11050 | 0.0011 | - |
| 3.1119 | 11100 | 0.0008 | - |
| 3.1259 | 11150 | 0.0009 | - |
| 3.1399 | 11200 | 0.0008 | - |
| 3.1539 | 11250 | 0.0033 | - |
| 3.1679 | 11300 | 0.0032 | - |
| 3.1819 | 11350 | 0.0008 | - |
| 3.1960 | 11400 | 0.0008 | - |
| 3.2100 | 11450 | 0.001 | - |
| 3.2240 | 11500 | 0.0009 | - |
| 3.2380 | 11550 | 0.0008 | - |
| 3.2520 | 11600 | 0.0008 | - |
| 3.2660 | 11650 | 0.0008 | - |
| 3.2801 | 11700 | 0.0009 | - |
| 3.2941 | 11750 | 0.0008 | - |
| 3.3081 | 11800 | 0.0007 | - |
| 3.3221 | 11850 | 0.0008 | - |
| 3.3361 | 11900 | 0.0008 | - |
| 3.3502 | 11950 | 0.0009 | - |
| 3.3642 | 12000 | 0.0008 | - |
| 3.3782 | 12050 | 0.0007 | - |
| 3.3922 | 12100 | 0.0009 | - |
| 3.4062 | 12150 | 0.0008 | - |
| 3.4202 | 12200 | 0.0008 | - |
| 3.4343 | 12250 | 0.0009 | - |
| 3.4483 | 12300 | 0.0008 | - |
| 3.4623 | 12350 | 0.0008 | - |
| 3.4763 | 12400 | 0.0008 | - |
| 3.4903 | 12450 | 0.0009 | - |
| 3.5043 | 12500 | 0.0007 | - |
| 3.5184 | 12550 | 0.0008 | - |
| 3.5324 | 12600 | 0.0009 | - |
| 3.5464 | 12650 | 0.0031 | - |
| 3.5604 | 12700 | 0.0009 | - |
| 3.5744 | 12750 | 0.0008 | - |
| 3.5884 | 12800 | 0.0007 | - |
| 3.6025 | 12850 | 0.0007 | - |
| 3.6165 | 12900 | 0.0007 | - |
| 3.6305 | 12950 | 0.0008 | - |
| 3.6445 | 13000 | 0.0007 | - |
| 3.6585 | 13050 | 0.0008 | - |
| 3.6726 | 13100 | 0.0007 | - |
| 3.6866 | 13150 | 0.0007 | - |
| 3.7006 | 13200 | 0.0008 | - |
| 3.7146 | 13250 | 0.0007 | - |
| 3.7286 | 13300 | 0.0031 | - |
| 3.7426 | 13350 | 0.0006 | - |
| 3.7567 | 13400 | 0.0008 | - |
| 3.7707 | 13450 | 0.0007 | - |
| 3.7847 | 13500 | 0.0006 | - |
| 3.7987 | 13550 | 0.0007 | - |
| 3.8127 | 13600 | 0.0008 | - |
| 3.8267 | 13650 | 0.0007 | - |
| 3.8408 | 13700 | 0.0008 | - |
| 3.8548 | 13750 | 0.0007 | - |
| 3.8688 | 13800 | 0.0007 | - |
| 3.8828 | 13850 | 0.0007 | - |
| 3.8968 | 13900 | 0.0007 | - |
| 3.9108 | 13950 | 0.0007 | - |
| 3.9249 | 14000 | 0.0031 | - |
| 3.9389 | 14050 | 0.003 | - |
| 3.9529 | 14100 | 0.0007 | - |
| 3.9669 | 14150 | 0.0007 | - |
| 3.9809 | 14200 | 0.0007 | - |
| 3.9950 | 14250 | 0.0007 | - |
| 4.0 | 14268 | - | 0.1155 |

### Framework Versions
- Python: 3.11.10
- SetFit: 1.1.0
- Sentence Transformers: 3.2.0
- Transformers: 4.45.2
- PyTorch: 2.4.1+cu124
- Datasets: 3.0.1
- Tokenizers: 0.20.1

## Citation

### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
```