

    Task: Sentiment Analysis
    Model: BERT-TWEET
    Lang: IT
  

Model description

This is a BERT [1] uncased model for the Italian language, fine-tuned for Sentiment Analysis (positive and negative classes only) on the SENTIPOLC-16 dataset, using BERT-TWEET-ITALIAN (bert-tweet-base-italian-uncased) as the pre-trained base model.

Training and Performance

The model is trained to perform binary sentiment classification (positive vs. negative) and is meant to be used primarily on tweets and other social media posts. It was fine-tuned for Sentiment Analysis on the SENTIPOLC-16 dataset for 3 epochs with a constant learning rate of 1e-5, using class weighting to compensate for the class imbalance. Instances annotated with both positive and negative sentiment were excluded, resulting in 4154 training instances and 1050 test instances.
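
As an illustration, below is a minimal sketch of this kind of class-weighted fine-tuning. The base model id, the toy data, and the weight values are assumptions for the example, not the original training script.

import torch
from torch.optim import AdamW
from transformers import BertTokenizerFast, BertForSequenceClassification

# toy stand-ins for the SENTIPOLC-16 training data (placeholders, not the real dataset)
texts = ["che bella giornata!", "servizio pessimo, non ci tornerò"]
labels = torch.tensor([1, 0])  # 1 = positive, 0 = negative

# assumed Hub id for the pre-trained base model
base = "osiria/bert-tweet-base-italian-uncased"
tokenizer = BertTokenizerFast.from_pretrained(base)
model = BertForSequenceClassification.from_pretrained(base, num_labels=2)

# class weighting to compensate for the imbalance (weight values are placeholders)
loss_fn = torch.nn.CrossEntropyLoss(weight=torch.tensor([1.0, 1.3]))
optimizer = AdamW(model.parameters(), lr=1e-5)  # constant learning rate, as above

model.train()
for epoch in range(3):  # 3 epochs, as above
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    logits = model(**batch).logits
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()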

The performance on the test set is reported in the following table:

Accuracy | Recall | Precision | F1
83.67    | 83.15  | 80.48     | 81.49

The Recall, Precision, and F1 metrics are averaged over the two classes (macro average).
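
These figures can be reproduced with scikit-learn's macro averaging; the label arrays below are illustrative placeholders, not the actual test predictions.

from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# placeholders: in practice these come from running the model on the SENTIPOLC-16 test split
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 0]

accuracy = accuracy_score(y_true, y_pred)
precision, recall, f1, _ = precision_recall_fscore_support(y_true, y_pred, average="macro")
print(f"Accuracy: {accuracy:.2%} | Recall: {recall:.2%} | Precision: {precision:.2%} | F1: {f1:.2%}")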

Quick usage

from transformers import BertTokenizerFast, BertForSequenceClassification, pipeline

# load the fine-tuned tokenizer and model from the Hugging Face Hub
tokenizer = BertTokenizerFast.from_pretrained("osiria/bert-tweet-italian-uncased-sentiment")
model = BertForSequenceClassification.from_pretrained("osiria/bert-tweet-italian-uncased-sentiment")

# wrap them in a text-classification pipeline
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

classifier("una fantastica giornata di #calcio! grande prestazione del mister e della squadra")

# [{'label': 'POSITIVE', 'score': 0.9883694648742676}]
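
The pipeline also accepts a list of texts, which is convenient for scoring batches of tweets; passing truncation=True guards against inputs longer than the model's maximum length. The tweets and output below are illustrative.

tweets = [
    "una fantastica giornata di #calcio!",
    "servizio clienti pessimo, non ci tornerò mai più",
]
results = classifier(tweets, truncation=True)
# e.g. [{'label': 'POSITIVE', 'score': 0.99}, {'label': 'NEGATIVE', 'score': 0.98}]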

References

[1] Devlin, J., Chang, M.-W., Lee, K., & Toutanova, K. (2018). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. https://arxiv.org/abs/1810.04805

Limitations

This model was trained on tweets, so it is mainly suitable for general-purpose social media text processing, involving short texts written in a social network style. It might show limitations when dealing with longer and more structured text, or with domain-specific text.

License

The model is released under the Apache-2.0 license.
