BART GEC
This is a reproduction of the following paper:
```bibtex
@inproceedings{katsumata-komachi-2020-stronger,
    title = "Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model",
    author = "Katsumata, Satoru and
      Komachi, Mamoru",
    booktitle = "Proceedings of the 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing",
    month = dec,
    year = "2020",
    address = "Suzhou, China",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.aacl-main.83",
    pages = "827--832",
}
```
This model achieves the following results:
| Data | Metric | gotutiyan/gec-bart-large | Paper (bart-large) |
|---|---|---|---|
| CoNLL-2014 | M2 (P/R/F0.5) | 71.0 / 43.3 / 62.9 | 69.3 / 45.0 / 62.6 |
| BEA19-test | ERRANT (P/R/F0.5) | 70.4 / 55.0 / 66.6 | 68.3 / 57.1 / 65.6 |
| JFLEG-test | GLEU | 57.8 | 57.3 |
The details can be found in the GitHub repository.
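For reference, the checkpoint can be loaded with the standard `transformers` seq2seq API. The snippet below is a minimal inference sketch, assuming the model works with `AutoModelForSeq2SeqLM`; the example sentence and the decoding parameters (`num_beams`, `max_length`) are illustrative and are not the settings used to produce the scores above.

```python
# Minimal sketch: correcting a sentence with the gec-bart-large checkpoint.
# Assumption: the model follows the standard Hugging Face seq2seq interface.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("gotutiyan/gec-bart-large")
model = AutoModelForSeq2SeqLM.from_pretrained("gotutiyan/gec-bart-large")

# An ungrammatical input sentence (illustrative example).
src = "This sentences contain grammatical error ."
inputs = tokenizer(src, return_tensors="pt")

# Beam-search decoding; the decoded output is the corrected sentence.
# num_beams and max_length are illustrative, not the paper's settings.
outputs = model.generate(**inputs, num_beams=5, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```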