Update README.md
README.md
@@ -11,7 +11,7 @@ BERT is an auto-encoder transformer model pretrained on a large corpus of Englis
 - Next sentence prediction (NSP)

 ## Fine-tuned Model Description: BERT fine-tuned CoLA

-The pretrained model can be fine-tuned on other NLP tasks. This BERT model has been fine-tuned on the CoLA dataset from the GLUE benchmark, an academic benchmark that aims to measure the performance of ML models. CoLA is one of the 11 datasets in this
+The pretrained model can be fine-tuned on other NLP tasks. This BERT model has been fine-tuned on the CoLA dataset from the GLUE benchmark, an academic benchmark that aims to measure the performance of ML models. CoLA is one of the 11 datasets in the GLUE benchmark.

 By fine-tuning BERT on the CoLA dataset, the model can classify a given sentence as grammatically and semantically acceptable or not acceptable.
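The acceptable/not-acceptable decision described above can be sketched as follows. This is a minimal illustration, not code from this repository: it assumes a typical CoLA fine-tune, where the classification head emits two logits (label 0 = unacceptable, label 1 = acceptable) and the predicted label is the softmax argmax.

```python
import math

# Assumed label order for a CoLA-style binary head; not stated in this README.
LABELS = ["unacceptable", "acceptable"]

def softmax(logits):
    """Convert raw logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    """Map a pair of logits to (label, confidence)."""
    probs = softmax(logits)
    idx = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[idx], probs[idx]

# Illustrative logits such a head might emit for a grammatical sentence.
label, score = classify([-1.2, 2.3])
print(label, round(score, 3))  # → acceptable 0.971
```

In practice one would load the fine-tuned checkpoint with the Hugging Face `text-classification` pipeline, which applies this same softmax-and-argmax step to the model's logits internally.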