---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- conll2003
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: distilbert-base-uncased-finetuned-tokenclassification
  results:
  - task:
      name: Token Classification
      type: token-classification
    dataset:
      name: conll2003
      type: conll2003
      args: conll2003
    metrics:
    - name: Precision
      type: precision
      value: 0.9239094422970734
    - name: Recall
      type: recall
      value: 0.9358988701197002
    - name: F1
      type: f1
      value: 0.9298655107257975
    - name: Accuracy
      type: accuracy
      value: 0.9833669595056158
---

# distilbert-base-uncased-finetuned-tokenclassification

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the conll2003 dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0622
- Precision: 0.9239
- Recall: 0.9359
- F1: 0.9299
- Accuracy: 0.9834

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.2459        | 1.0   | 878  | 0.0716          | 0.9191    | 0.9220 | 0.9206 | 0.9814   |
| 0.0545        | 2.0   | 1756 | 0.0620          | 0.9239    | 0.9349 | 0.9294 | 0.9829   |
| 0.0292        | 3.0   | 2634 | 0.0622          | 0.9239    | 0.9359 | 0.9299 | 0.9834   |

### Framework versions

- Transformers 4.20.1
- Pytorch 1.12.1
- Datasets 2.9.0
- Tokenizers 0.11.0
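
### Usage example

The card does not include an inference snippet. Below is a minimal sketch of how a token-classification checkpoint like this one is typically loaded with the `transformers` pipeline API; the `model_id` is a placeholder (substitute the actual Hub repo id, including its namespace, or a local checkpoint path).

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

# Placeholder identifier: replace with the real Hub repo id or a local path.
model_id = "distilbert-base-uncased-finetuned-tokenclassification"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Group word pieces back into whole entity spans for readable output.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)

print(ner("Hugging Face Inc. is based in New York City."))
```

For reference, the hyperparameters listed above map onto `TrainingArguments` roughly as follows; this is an assumed reconstruction, not the authors' actual training script, and unlisted options (such as `output_dir` and the evaluation strategy) are guesses.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-tokenclassification",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",
    evaluation_strategy="epoch",  # assumption: per-epoch evaluation, matching the results table
)
```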