
SetFit Polarity Model with sentence-transformers/all-mpnet-base-v2

This is a SetFit model that can be used for Aspect Based Sentiment Analysis (ABSA). This SetFit model uses sentence-transformers/all-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification. In particular, this model is in charge of classifying aspect polarities.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.

This model was trained within the context of a larger system for ABSA, which works as follows:

  1. Use a spaCy model to select possible aspect span candidates.
  2. Use a SetFit model to filter these possible aspect span candidates.
  3. Use this SetFit model to classify the filtered aspect span candidates.
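
As an end-to-end illustration of how the two training steps and this pipeline fit together, here is a minimal training sketch using SetFit's AbsaTrainer. The toy dataset, the spaCy model name, and the base-model choice are assumptions for illustration, not the exact recipe behind this card.

from datasets import Dataset
from setfit import AbsaModel, AbsaTrainer

# Hypothetical toy data in SetFit's ABSA format: "text" is the full
# sentence, "span" an aspect term inside it, "label" its polarity, and
# "ordinal" disambiguates repeated occurrences of the same span.
train_dataset = Dataset.from_dict({
    "text": ["The battery life is great but the screen is dim."] * 2,
    "span": ["battery life", "screen"],
    "label": ["positive", "negative"],
    "ordinal": [0, 0],
})

# A single base model id initializes both the aspect filter and the polarity classifier.
model = AbsaModel.from_pretrained(
    "sentence-transformers/all-mpnet-base-v2",
    spacy_model="en_core_web_sm",  # assumption: the card does not name its spaCy model
)

trainer = AbsaTrainer(model, train_dataset=train_dataset)
trainer.train()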

Model Details

Model Description

  • Model Type: SetFit (ABSA polarity classifier)
  • Sentence Transformer body: sentence-transformers/all-mpnet-base-v2
  • Classification head: LogisticRegression
  • Number of Classes: 4 (negative, neutral, positive, conflict)
  • Model Size: 109M parameters (F32)

Model Sources

  • Repository: https://github.com/huggingface/setfit
  • Paper: https://arxiv.org/abs/2209.11055

Model Labels

Each label below is shown with example training pairs in "aspect span:full sentence" format.

neutral
  • 'skip taking the cord with me because:I charge it at night and skip taking the cord with me because of the good battery life.'
  • 'The tech guy then said the:The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the "sales" team, which is the retail shop which I bought my netbook from.'
  • 'all dark, power light steady, hard:One night I turned the freaking thing off after using it, the next day I turn it on, no GUI, screen all dark, power light steady, hard drive light steady and not flashing as it usually does.'
positive
  • 'of the good battery life.:I charge it at night and skip taking the cord with me because of the good battery life.'
  • 'is of high quality, has a:it is of high quality, has a killer GUI, is extremely stable, is highly expandable, is bundled with lots of very good applications, is easy to use, and is absolutely gorgeous.'
  • 'has a killer GUI, is extremely:it is of high quality, has a killer GUI, is extremely stable, is highly expandable, is bundled with lots of very good applications, is easy to use, and is absolutely gorgeous.'
negative
  • 'then said the service center does not do:The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the "sales" team, which is the retail shop which I bought my netbook from.'
  • 'concern to the "sales" team, which is:The tech guy then said the service center does not do 1-to-1 exchange and I have to direct my concern to the "sales" team, which is the retail shop which I bought my netbook from.'
  • 'on, no GUI, screen all:One night I turned the freaking thing off after using it, the next day I turn it on, no GUI, screen all dark, power light steady, hard drive light steady and not flashing as it usually does.'
conflict
  • '-No backlit keyboard, but not:-No backlit keyboard, but not an issue for me.'
  • "to replace the battery once, but:I did have to replace the battery once, but that was only a couple months ago and it's been working perfect ever since."
  • 'Yes, they cost more, but:Yes, they cost more, but they more than make up for it in speed, construction quality, and longevity.'

Evaluation

Metrics

Label   Accuracy
all     0.7788
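
As a sanity check, a label accuracy like the 0.7788 above is simply the fraction of evaluation aspects whose predicted polarity matches the gold polarity. A minimal sketch with hypothetical labels:

from sklearn.metrics import accuracy_score

# Hypothetical gold and predicted polarities for four evaluation aspects.
gold = ["positive", "negative", "neutral", "positive"]
predicted = ["positive", "negative", "positive", "positive"]
print(accuracy_score(gold, predicted))  # 0.75 (3 of 4 correct)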

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit
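
For the full ABSA pipeline, the SetFit documentation suggests installing the absa extra, which pulls in spaCy, along with a spaCy model (the spaCy model used for this card is not stated, so en_core_web_sm below is an assumption):

pip install "setfit[absa]"
python -m spacy download en_core_web_sm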

Then you can load this model and run inference.

from setfit import AbsaModel

# Download from the 🤗 Hub. The first argument selects the aspect model;
# "setfit-absa-aspect" appears to be a placeholder, so substitute the
# repo id of the matching aspect-extraction model.
model = AbsaModel.from_pretrained(
    "setfit-absa-aspect",
    "marcelomoreno26/all-mpnet-base-v2-absa-polarity2",
)
# Run inference
preds = model("The food was great, but the venue is just way too busy.")
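
If the aspect and polarity models are paired correctly, the SetFit documentation shows predictions coming back as span/polarity dicts per input sentence, along the lines of:

[{'span': 'food', 'polarity': 'positive'}, {'span': 'venue', 'polarity': 'negative'}]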

Training Details

Training Set Metrics

Training set   Min   Median    Max
Word count     3     24.3447   80

Label      Training Sample Count
negative   235
neutral    127
positive   271
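
These statistics are straightforward to reproduce; a small sketch over a hypothetical list of (sentence, label) training pairs:

from collections import Counter
from statistics import median

# Hypothetical training pairs; the real training set is not shown on this card.
train = [
    ("The battery life is great.", "positive"),
    ("The screen is dim.", "negative"),
    ("It arrived on Tuesday.", "neutral"),
]

word_counts = [len(text.split()) for text, _ in train]
print(min(word_counts), median(word_counts), max(word_counts))  # word-count stats
print(Counter(label for _, label in train))                     # per-label sample counts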

Training Hyperparameters

  • batch_size: (16, 2)
  • num_epochs: (1, 16)
  • max_steps: -1
  • sampling_strategy: oversampling
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
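
As a sketch (not the original training script), these settings map onto SetFit's TrainingArguments, where tuples configure the embedding-finetuning and classifier-training phases separately:

from setfit import TrainingArguments
from sentence_transformers.losses import CosineSimilarityLoss

args = TrainingArguments(
    batch_size=(16, 2),                  # (embedding phase, classifier phase)
    num_epochs=(1, 16),
    max_steps=-1,
    sampling_strategy="oversampling",
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    seed=42,
    load_best_model_at_end=False,
)

Such arguments would then be passed to the trainer, e.g. AbsaTrainer(model, args=args, train_dataset=train_dataset).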

Training Results

Epoch Step Training Loss Validation Loss
0.0001 1 0.3749 -
0.0030 50 0.3097 -
0.0059 100 0.2214 -
0.0089 150 0.2125 -
0.0119 200 0.3202 -
0.0148 250 0.1878 -
0.0178 300 0.1208 -
0.0208 350 0.2414 -
0.0237 400 0.1961 -
0.0267 450 0.0607 -
0.0296 500 0.1103 -
0.0326 550 0.1213 -
0.0356 600 0.0972 -
0.0385 650 0.0124 -
0.0415 700 0.0151 -
0.0445 750 0.1517 -
0.0474 800 0.004 -
0.0504 850 0.0204 -
0.0534 900 0.0541 -
0.0563 950 0.003 -
0.0593 1000 0.0008 -
0.0623 1050 0.0703 -
0.0652 1100 0.0013 -
0.0682 1150 0.0007 -
0.0712 1200 0.0009 -
0.0741 1250 0.0004 -
0.0771 1300 0.0004 -
0.0801 1350 0.0005 -
0.0830 1400 0.0006 -
0.0860 1450 0.0004 -
0.0889 1500 0.0002 -
0.0919 1550 0.0002 -
0.0949 1600 0.0001 -
0.0978 1650 0.0006 -
0.1008 1700 0.0002 -
0.1038 1750 0.0012 -
0.1067 1800 0.0008 -
0.1097 1850 0.0048 -
0.1127 1900 0.0007 -
0.1156 1950 0.0001 -
0.1186 2000 0.0001 -
0.1216 2050 0.0001 -
0.1245 2100 0.0001 -
0.1275 2150 0.0001 -
0.1305 2200 0.0001 -
0.1334 2250 0.0 -
0.1364 2300 0.0001 -
0.1394 2350 0.0002 -
0.1423 2400 0.0 -
0.1453 2450 0.0 -
0.1482 2500 0.0589 -
0.1512 2550 0.0036 -
0.1542 2600 0.0013 -
0.1571 2650 0.0 -
0.1601 2700 0.0001 -
0.1631 2750 0.0004 -
0.1660 2800 0.0 -
0.1690 2850 0.0002 -
0.1720 2900 0.0096 -
0.1749 2950 0.0 -
0.1779 3000 0.0 -
0.1809 3050 0.0001 -
0.1838 3100 0.0 -
0.1868 3150 0.0001 -
0.1898 3200 0.0001 -
0.1927 3250 0.0 -
0.1957 3300 0.0 -
0.1986 3350 0.0001 -
0.2016 3400 0.0 -
0.2046 3450 0.0002 -
0.2075 3500 0.0 -
0.2105 3550 0.0 -
0.2135 3600 0.0001 -
0.2164 3650 0.0 -
0.2194 3700 0.0 -
0.2224 3750 0.0001 -
0.2253 3800 0.0 -
0.2283 3850 0.0 -
0.2313 3900 0.0 -
0.2342 3950 0.0 -
0.2372 4000 0.0 -
0.2402 4050 0.0 -
0.2431 4100 0.0 -
0.2461 4150 0.0 -
0.2491 4200 0.0 -
0.2520 4250 0.0 -
0.2550 4300 0.0 -
0.2579 4350 0.0 -
0.2609 4400 0.0 -
0.2639 4450 0.0 -
0.2668 4500 0.0 -
0.2698 4550 0.0 -
0.2728 4600 0.0 -
0.2757 4650 0.0 -
0.2787 4700 0.0 -
0.2817 4750 0.0 -
0.2846 4800 0.0 -
0.2876 4850 0.0001 -
0.2906 4900 0.0071 -
0.2935 4950 0.1151 -
0.2965 5000 0.0055 -
0.2995 5050 0.0005 -
0.3024 5100 0.0041 -
0.3054 5150 0.0001 -
0.3083 5200 0.0003 -
0.3113 5250 0.0001 -
0.3143 5300 0.0 -
0.3172 5350 0.0001 -
0.3202 5400 0.0 -
0.3232 5450 0.0 -
0.3261 5500 0.0 -
0.3291 5550 0.0 -
0.3321 5600 0.0 -
0.3350 5650 0.0 -
0.3380 5700 0.0 -
0.3410 5750 0.0 -
0.3439 5800 0.0 -
0.3469 5850 0.0 -
0.3499 5900 0.0 -
0.3528 5950 0.0 -
0.3558 6000 0.0 -
0.3588 6050 0.0 -
0.3617 6100 0.0 -
0.3647 6150 0.0 -
0.3676 6200 0.0 -
0.3706 6250 0.0 -
0.3736 6300 0.0 -
0.3765 6350 0.0 -
0.3795 6400 0.0 -
0.3825 6450 0.0 -
0.3854 6500 0.0 -
0.3884 6550 0.0 -
0.3914 6600 0.0 -
0.3943 6650 0.0 -
0.3973 6700 0.0 -
0.4003 6750 0.0 -
0.4032 6800 0.0 -
0.4062 6850 0.0 -
0.4092 6900 0.0 -
0.4121 6950 0.0 -
0.4151 7000 0.0 -
0.4181 7050 0.0 -
0.4210 7100 0.0 -
0.4240 7150 0.0 -
0.4269 7200 0.0 -
0.4299 7250 0.0 -
0.4329 7300 0.0 -
0.4358 7350 0.0 -
0.4388 7400 0.0 -
0.4418 7450 0.0 -
0.4447 7500 0.0 -
0.4477 7550 0.0 -
0.4507 7600 0.0 -
0.4536 7650 0.0003 -
0.4566 7700 0.0 -
0.4596 7750 0.0 -
0.4625 7800 0.0 -
0.4655 7850 0.0 -
0.4685 7900 0.0 -
0.4714 7950 0.0 -
0.4744 8000 0.0 -
0.4773 8050 0.0 -
0.4803 8100 0.0 -
0.4833 8150 0.0 -
0.4862 8200 0.0 -
0.4892 8250 0.0 -
0.4922 8300 0.0 -
0.4951 8350 0.0 -
0.4981 8400 0.0 -
0.5011 8450 0.0 -
0.5040 8500 0.0 -
0.5070 8550 0.0 -
0.5100 8600 0.0 -
0.5129 8650 0.0 -
0.5159 8700 0.0 -
0.5189 8750 0.0 -
0.5218 8800 0.0 -
0.5248 8850 0.0 -
0.5278 8900 0.0 -
0.5307 8950 0.0 -
0.5337 9000 0.0 -
0.5366 9050 0.0 -
0.5396 9100 0.0 -
0.5426 9150 0.0 -
0.5455 9200 0.0 -
0.5485 9250 0.0 -
0.5515 9300 0.0 -
0.5544 9350 0.0 -
0.5574 9400 0.0 -
0.5604 9450 0.0 -
0.5633 9500 0.0 -
0.5663 9550 0.0 -
0.5693 9600 0.0 -
0.5722 9650 0.0 -
0.5752 9700 0.0 -
0.5782 9750 0.0 -
0.5811 9800 0.0 -
0.5841 9850 0.0 -
0.5870 9900 0.0 -
0.5900 9950 0.0 -
0.5930 10000 0.0 -
0.5959 10050 0.0 -
0.5989 10100 0.0 -
0.6019 10150 0.0 -
0.6048 10200 0.0 -
0.6078 10250 0.0 -
0.6108 10300 0.0 -
0.6137 10350 0.0 -
0.6167 10400 0.0 -
0.6197 10450 0.0 -
0.6226 10500 0.0 -
0.6256 10550 0.0 -
0.6286 10600 0.0 -
0.6315 10650 0.0 -
0.6345 10700 0.0 -
0.6375 10750 0.0 -
0.6404 10800 0.0 -
0.6434 10850 0.0 -
0.6463 10900 0.0 -
0.6493 10950 0.0 -
0.6523 11000 0.0 -
0.6552 11050 0.0 -
0.6582 11100 0.0 -
0.6612 11150 0.0 -
0.6641 11200 0.0 -
0.6671 11250 0.0 -
0.6701 11300 0.0 -
0.6730 11350 0.0 -
0.6760 11400 0.0 -
0.6790 11450 0.0 -
0.6819 11500 0.0 -
0.6849 11550 0.0 -
0.6879 11600 0.0 -
0.6908 11650 0.0 -
0.6938 11700 0.0 -
0.6968 11750 0.0 -
0.6997 11800 0.0 -
0.7027 11850 0.0 -
0.7056 11900 0.0 -
0.7086 11950 0.0 -
0.7116 12000 0.0 -
0.7145 12050 0.0 -
0.7175 12100 0.0 -
0.7205 12150 0.0 -
0.7234 12200 0.0 -
0.7264 12250 0.0 -
0.7294 12300 0.0 -
0.7323 12350 0.0 -
0.7353 12400 0.0 -
0.7383 12450 0.0 -
0.7412 12500 0.0 -
0.7442 12550 0.0 -
0.7472 12600 0.0 -
0.7501 12650 0.0 -
0.7531 12700 0.0 -
0.7560 12750 0.0 -
0.7590 12800 0.0 -
0.7620 12850 0.0 -
0.7649 12900 0.0 -
0.7679 12950 0.0 -
0.7709 13000 0.0 -
0.7738 13050 0.0 -
0.7768 13100 0.0 -
0.7798 13150 0.0 -
0.7827 13200 0.0 -
0.7857 13250 0.0 -
0.7887 13300 0.0 -
0.7916 13350 0.0 -
0.7946 13400 0.0 -
0.7976 13450 0.0 -
0.8005 13500 0.0 -
0.8035 13550 0.0 -
0.8065 13600 0.0 -
0.8094 13650 0.0 -
0.8124 13700 0.0 -
0.8153 13750 0.0 -
0.8183 13800 0.0 -
0.8213 13850 0.0 -
0.8242 13900 0.0 -
0.8272 13950 0.0 -
0.8302 14000 0.0 -
0.8331 14050 0.0 -
0.8361 14100 0.0 -
0.8391 14150 0.0 -
0.8420 14200 0.0 -
0.8450 14250 0.0 -
0.8480 14300 0.0 -
0.8509 14350 0.0 -
0.8539 14400 0.0 -
0.8569 14450 0.0 -
0.8598 14500 0.0 -
0.8628 14550 0.0 -
0.8657 14600 0.0 -
0.8687 14650 0.0 -
0.8717 14700 0.0 -
0.8746 14750 0.0 -
0.8776 14800 0.0 -
0.8806 14850 0.0 -
0.8835 14900 0.0 -
0.8865 14950 0.0 -
0.8895 15000 0.0 -
0.8924 15050 0.0 -
0.8954 15100 0.0 -
0.8984 15150 0.0 -
0.9013 15200 0.0 -
0.9043 15250 0.0 -
0.9073 15300 0.0 -
0.9102 15350 0.0 -
0.9132 15400 0.0 -
0.9162 15450 0.0 -
0.9191 15500 0.0 -
0.9221 15550 0.0 -
0.9250 15600 0.0 -
0.9280 15650 0.0 -
0.9310 15700 0.0 -
0.9339 15750 0.0 -
0.9369 15800 0.0 -
0.9399 15850 0.0 -
0.9428 15900 0.0 -
0.9458 15950 0.0 -
0.9488 16000 0.0 -
0.9517 16050 0.0 -
0.9547 16100 0.0 -
0.9577 16150 0.0 -
0.9606 16200 0.0 -
0.9636 16250 0.0 -
0.9666 16300 0.0 -
0.9695 16350 0.0 -
0.9725 16400 0.0 -
0.9755 16450 0.0 -
0.9784 16500 0.0 -
0.9814 16550 0.0 -
0.9843 16600 0.0 -
0.9873 16650 0.0 -
0.9903 16700 0.0 -
0.9932 16750 0.0 -
0.9962 16800 0.0 -
0.9992 16850 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.0.3
  • Sentence Transformers: 2.7.0
  • spaCy: 3.7.4
  • Transformers: 4.40.1
  • PyTorch: 2.2.1+cu121
  • Datasets: 2.19.0
  • Tokenizers: 0.19.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}
