This is a MicroBERT model for Ancient Greek.

  • Its suffix is -m, which means that it was pretrained using supervision from masked language modeling.
  • The unlabeled Ancient Greek data was taken from the Diorisis corpus, totaling 9,058,227 tokens.
  • The UD treebank UD_Ancient_Greek-PROIEL, v2.9, totaling 213,999 tokens, was used for labeled data.

Please see the repository and the paper for more details.
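Since the model was pretrained with masked language modeling, it can be used with the Hugging Face `fill-mask` pipeline. A minimal sketch follows; the Hub model ID `lgessler/microbert-ancient-greek-m` is an assumption here, so confirm the exact name in the repository before use.

```python
def top_mask_fills(text, model_id="lgessler/microbert-ancient-greek-m", k=5):
    """Return the top-k fill-mask predictions for `text`.

    `text` should contain the tokenizer's mask token (e.g. [MASK]).
    NOTE: `model_id` is an assumed Hub name; check the repository
    for the exact identifier. The model is downloaded on first call.
    """
    from transformers import pipeline  # imported lazily: heavy dependency
    fill = pipeline("fill-mask", model=model_id)
    return [pred["token_str"] for pred in fill(text, top_k=k)]

# Example usage (requires network access and the `transformers` package):
# top_mask_fills("ὁ [MASK] λόγος")
```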
