Model Description

This is a Bengali fake news detection model, version 1.0. The model was introduced in the paper "Tackling Fake News in Bengali: Unraveling the Impact of Summarization vs. Augmentation on Pre-trained Language Models" (arXiv:2307.06979); see the citation below. An original implementation is deployed as a Hugging Face Space.

In the hosted inference API widget, the labels mean: LABEL_0 = Fake, LABEL_1 = Authentic.

Model type: deep learning classifier

Finetuned from model: https://huggingface.co/bert-base-multilingual-cased
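
A quick way to see these labels locally is the standard text-classification pipeline. This is a minimal sketch, not part of the original card; the Bengali input string is only an illustrative placeholder.

from transformers import pipeline

# Load the model and tokenizer through the high-level pipeline API
classifier = pipeline("text-classification", model="armansakif/bengali-fake-news")

# Returns a list of dicts, e.g. [{'label': 'LABEL_0', 'score': ...}]
print(classifier("এটি একটি বাংলা সংবাদ"))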

How to load this model using transformers (tested with transformers 4.31.0 on Python 3):

from transformers import AutoTokenizer, BertForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained('armansakif/bengali-fake-news')

model = BertForSequenceClassification.from_pretrained(
    "armansakif/bengali-fake-news", # fine-tuned from bert-base-multilingual-cased
    num_labels = 2,                 # two output labels: LABEL_0 = Fake, LABEL_1 = Authentic
    output_attentions = False,      # do not return attention weights
    output_hidden_states = False,   # do not return all hidden states
)
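
Once the model and tokenizer are loaded, inference is a single forward pass. The following is a minimal sketch, not from the original card; the Bengali sample sentence and the label-to-name mapping are illustrative, based on the LABEL_0/LABEL_1 meanings given above.

import torch

text = "এটি একটি বাংলা সংবাদ শিরোনাম"  # illustrative Bengali placeholder, not a real headline
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

model.eval()
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 2)

pred = logits.argmax(dim=-1).item()
label_map = {0: "Fake", 1: "Authentic"}  # LABEL_0 = Fake, LABEL_1 = Authentic
print(label_map[pred])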

Citation

If you use this model, please cite the following paper.

BibTeX:

@article{chowdhury2023tackling,
  title={Tackling Fake News in Bengali: Unraveling the Impact of Summarization vs. Augmentation on Pre-trained Language Models},
  author={Chowdhury, Arman Sakif and Shahariar, GM and Aziz, Ahammed Tarik and Alam, Syed Mohibul and Sheikh, Md Azad and Belal, Tanveer Ahmed},
  journal={arXiv preprint arXiv:2307.06979},
  year={2023}
}

APA:

Chowdhury, A. S., Shahariar, G. M., Aziz, A. T., Alam, S. M., Sheikh, M. A., & Belal, T. A. (2023). Tackling Fake News in Bengali: Unraveling the Impact of Summarization vs. Augmentation on Pre-trained Language Models. arXiv preprint arXiv:2307.06979.
