---
thumbnail: >-
  https://github.com/SauravMaheshkar/CommonLit-Readibility/blob/main/assets/CommonLit%20-%20Big%20Banner.png?raw=true
tags:
- kaggle
license: cc0-1.0
datasets:
- Commonlit-Readibility
metrics:
- Perplexity
---
## PreTraining
| Architecture | Weights | PreTraining Loss | PreTraining Perplexity |
| --- | --- | --- | --- |
| roberta-base | huggingface/hub | 0.3488 | 3.992 |
| bert-base-uncased | huggingface/hub | 0.3909 | 6.122 |
| electra-large | huggingface/hub | 0.723 | 6.394 |
| albert-base | huggingface/hub | 0.7343 | 7.76 |
| electra-small | huggingface/hub | 0.9226 | 11.098 |
| electra-base | huggingface/hub | 0.9468 | 8.783 |
| distilbert-base-uncased | huggingface/hub | 1.082 | 7.963 |
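As general background on the metric (the table's numbers come from this repository's own evaluation, which may use a different normalization), perplexity for a language model is conventionally the exponential of the mean per-token cross-entropy loss. A minimal sketch of that definition:

```python
import math

def perplexity(token_losses):
    """Perplexity as exp of the mean per-token cross-entropy (in nats)."""
    return math.exp(sum(token_losses) / len(token_losses))

# Hypothetical per-token losses; mean is 0.35, so perplexity is exp(0.35)
print(round(perplexity([0.25, 0.40, 0.40]), 3))  # 1.419
```

Lower loss therefore maps monotonically to lower perplexity, which is why the table rows are broadly ordered by loss.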