
GrεTa

The paper Exploring Large Language Models for Classical Philology is the first effort to systematically provide state-of-the-art language models for Classical Philology. GrεTa is a T5-base-sized, monolingual, encoder-decoder variant for Ancient Greek.

This model was trained in two stages. Initially, it was pre-trained on a recently acquired corpus of OCR scans obtained from the Internet Archive. Subsequently, it was further trained on data from the Open Greek & Latin Project, the CLARIN corpus Greek Medieval Texts, and the Patrologia Graeca.

Further information can be found in our paper or in our GitHub repository.

Usage

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained('bowphs/GreTa')
model = AutoModelForSeq2SeqLM.from_pretrained('bowphs/GreTa')

Please check out the awesome Hugging Face tutorials on how to fine-tune our models.
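As a T5-style denoising model, GrεTa can be asked to fill in a masked span directly. A minimal sketch, assuming the standard T5 sentinel tokens (the example sentence and generation settings are illustrative only):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained('bowphs/GreTa')
model = AutoModelForSeq2SeqLM.from_pretrained('bowphs/GreTa')

# Mask a span with the first sentinel token and let the model reconstruct it.
text = 'ἐν ἀρχῇ ἦν ὁ <extra_id_0>, καὶ ὁ λόγος ἦν πρὸς τὸν θεόν.'
inputs = tokenizer(text, return_tensors='pt')
outputs = model.generate(**inputs, max_new_tokens=10)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For downstream tasks, you would fine-tune rather than rely on the raw denoising objective.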

Evaluation Results

When fine-tuned on data from Universal Dependencies 2.10, GrεTa achieves the following results on the Ancient Greek Perseus dataset:

Task    XPoS    UPoS    UAS     LAS     Lemma
Score   94.44   89.03   87.32   83.06   91.14

Please note that the PoS tagging and dependency parsing results are obtained using only the encoder component of the model.
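Because these tasks use only the encoder, you can load it on its own for token-level feature extraction. A minimal sketch (the input sentence is illustrative; the hidden size follows the T5-base configuration):

```python
import torch
from transformers import AutoTokenizer, T5EncoderModel

tokenizer = AutoTokenizer.from_pretrained('bowphs/GreTa')
encoder = T5EncoderModel.from_pretrained('bowphs/GreTa')

inputs = tokenizer('μῆνιν ἄειδε θεά', return_tensors='pt')
with torch.no_grad():
    # Contextual representations: shape (batch, sequence_length, d_model)
    hidden = encoder(**inputs).last_hidden_state
print(hidden.shape)
```

A token-classification head (e.g. for PoS tagging) can then be trained on top of these representations.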

Contact

If you have any questions or problems, feel free to reach out.

Citation

@incollection{riemenschneiderfrank:2023,
    address = "Toronto, Canada",
    author = "Riemenschneider, Frederick and Frank, Anette",
    booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023)",
    publisher = "Association for Computational Linguistics",
    title = "Exploring Large Language Models for Classical Philology",
    url = "https://arxiv.org/abs/2305.13698",
    year = "2023"
}