If you'd like to use a larger BERT-large model fine-tuned on the same dataset, a [**bert-large-NER**](https://huggingface.co/dslim/bert-large-NER/) version is also available.
### Available NER models
| Model Name | Description | Parameters |
|-------------------|-------------|------------------|
| [bert-large-NER](https://huggingface.co/dslim/bert-large-NER/) | Fine-tuned bert-large-cased - larger model with slightly better performance | 340M |
| [bert-base-NER](https://huggingface.co/dslim/bert-base-NER)-([uncased](https://huggingface.co/dslim/bert-base-NER-uncased)) | Fine-tuned bert-base, available in both cased and uncased versions | 110M |
| [distillbert-NER](https://huggingface.co/dslim/distillbert-NER) | Fine-tuned DistilBERT - a smaller, faster, lighter version of BERT | 66M |
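Any of the models in the table can be swapped in by model ID. As a minimal sketch (assuming the `transformers` library is installed), bert-base-NER can be run through the `pipeline` API:

```python
# Minimal sketch: running NER with one of the models above via the
# Hugging Face transformers pipeline (model ID taken from the table).
from transformers import pipeline

# aggregation_strategy="simple" merges word-piece tokens into whole entities
ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

results = ner("My name is Wolfgang and I live in Berlin")
for entity in results:
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```

To trade accuracy for speed, replace `dslim/bert-base-NER` with `dslim/bert-large-NER` or `dslim/distillbert-NER`.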
## Intended uses & limitations