---
license: cc-by-4.0
language: mr
datasets:
- L3Cube-MahaCorpus
---

## MahaBERT
MahaBERT is a Marathi BERT model. It is a multilingual BERT (bert-base-multilingual-cased) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets.

[Dataset link](https://github.com/l3cube-pune/MarathiNLP)

More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).

```
@InProceedings{joshi:2022:WILDRE6,
  author    = {Joshi, Raviraj},
  title     = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {97--101}
}
```
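As a fine-tuned BERT masked language model, MahaBERT can be loaded with the Hugging Face `transformers` library. A minimal fill-mask sketch follows; the hub id `l3cube-pune/marathi-bert` is an assumption here, so check the model page for the exact identifier.

```python
# Minimal sketch: masked-token prediction with MahaBERT via transformers.
# NOTE: the hub id below is an assumption, not confirmed by this card.
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

MODEL_ID = "l3cube-pune/marathi-bert"  # assumed hub identifier


def main() -> None:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForMaskedLM.from_pretrained(MODEL_ID)

    # Fill-mask pipeline: rank candidate tokens for the masked position
    # in a Marathi sentence.
    fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
    predictions = fill(f"मी एक {tokenizer.mask_token} आहे.")
    for p in predictions[:3]:
        print(p["token_str"], round(p["score"], 4))


if __name__ == "__main__":
    main()
```

Since the checkpoint is derived from bert-base-multilingual-cased, it uses the WordPiece tokenizer and `[MASK]` token conventions of that base model.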