---
license: cc-by-4.0
language:
- mr
---
## MahaBERT
MahaBERT is a Marathi BERT model. It is a monolingual Marathi BERT model (l3cube-pune/marathi-bert-v2) fine-tuned on the [L3Cube-MahaNews dataset](https://github.com/l3cube-pune/MarathiNLP/tree/main/L3Cube-MahaNews). More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2401.02254).
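A minimal usage sketch with the Hugging Face transformers library is shown below. The repository id in `MODEL_ID` is the base model named on this card; replace it with this fine-tuned checkpoint's Hub id, and note that the sample Marathi text and the printed label mapping are illustrative assumptions, not outputs confirmed by the card.

```python
# Minimal sketch: load a Marathi BERT sequence-classification checkpoint and
# classify one piece of text. Assumes `transformers` and `torch` are installed.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumption: swap in this model's actual Hub repository id;
# l3cube-pune/marathi-bert-v2 is the base model mentioned in the card.
MODEL_ID = "l3cube-pune/marathi-bert-v2"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)

# Illustrative Marathi input text.
text = "मराठी बातमीचा मजकूर येथे द्या."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index to its label name from the model config.
predicted_class = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_class])
```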
## Citing
```
@article{mirashi2024l3cube,
  title={L3Cube-IndicNews: News-based Short Text and Long Document Classification Datasets in Indic Languages},
  author={Mirashi, Aishwarya and Sonavane, Srushti and Lingayat, Purva and Padhiyar, Tejas and Joshi, Raviraj},
  journal={arXiv preprint arXiv:2401.02254},
  year={2024}
}
```
Other Monolingual Indic BERT models are listed below:
- Marathi BERT
- Marathi RoBERTa
- Marathi AlBERT
- Hindi BERT
- Hindi RoBERTa
- Hindi AlBERT
- Dev BERT
- Dev RoBERTa
- Dev AlBERT
- Kannada BERT
- Telugu BERT
- Malayalam BERT
- Tamil BERT
- Gujarati BERT
- Oriya BERT
- Bengali BERT
- Punjabi BERT
- Assamese BERT