almaghrabima committed
Commit bdf2f72
1 Parent(s): 222db1f

End of training

Files changed (2)
  1. README.md +30 -36
  2. pytorch_model.bin +1 -1
README.md CHANGED
@@ -1,5 +1,6 @@
  ---
  license: mit
+ base_model: Gladiator/microsoft-deberta-v3-large_ner_conll2003
  tags:
  - generated_from_trainer
  metrics:
@@ -10,13 +11,6 @@ metrics:
  model-index:
  - name: ner_column_TQ
  results: []
- language:
- - en
- widget:
- - india 0S0308Z8 trudeau 3000 Ravensburger Hamnoy, Lofoten of gold bestseller 620463000001
- - other china lc waikiki mağazacilik hi̇zmetleri̇ ti̇c aş 630140000000 hilti 6204699090_BD 55L Toaster Oven with Double Glass
- - 611020000001 italy Apparel other games 9W1964Z8 debenhams guangzhou hec fashion leather co ltd
-
  ---

  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
@@ -26,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [Gladiator/microsoft-deberta-v3-large_ner_conll2003](https://huggingface.co/Gladiator/microsoft-deberta-v3-large_ner_conll2003) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.2111
- - Precision: 0.8593
- - Recall: 0.8587
- - F1: 0.8590
- - Accuracy: 0.9163
+ - Loss: 0.1949
+ - Precision: 0.8546
+ - Recall: 0.8533
+ - F1: 0.8540
+ - Accuracy: 0.9154

  ## Model description

@@ -61,31 +55,31 @@ The following hyperparameters were used during training:

  | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
  |:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
- | No log | 1.0 | 702 | 0.1938 | 0.7778 | 0.7851 | 0.7815 | 0.8957 |
- | 0.3587 | 2.0 | 1404 | 0.1562 | 0.8216 | 0.8219 | 0.8217 | 0.9098 |
- | 0.1645 | 3.0 | 2106 | 0.1472 | 0.8161 | 0.8268 | 0.8214 | 0.9114 |
- | 0.1645 | 4.0 | 2808 | 0.1528 | 0.8357 | 0.8195 | 0.8275 | 0.9097 |
- | 0.1372 | 5.0 | 3510 | 0.1411 | 0.8301 | 0.8349 | 0.8325 | 0.9141 |
- | 0.1259 | 6.0 | 4212 | 0.1396 | 0.8341 | 0.8431 | 0.8386 | 0.9149 |
- | 0.1259 | 7.0 | 4914 | 0.1470 | 0.8178 | 0.8323 | 0.8250 | 0.9126 |
- | 0.1205 | 8.0 | 5616 | 0.1413 | 0.8421 | 0.8480 | 0.8451 | 0.9156 |
- | 0.1152 | 9.0 | 6318 | 0.1417 | 0.8342 | 0.8481 | 0.8411 | 0.9158 |
- | 0.1126 | 10.0 | 7020 | 0.1475 | 0.8427 | 0.8493 | 0.8460 | 0.9154 |
- | 0.1126 | 11.0 | 7722 | 0.1490 | 0.8477 | 0.8510 | 0.8493 | 0.9155 |
- | 0.108 | 12.0 | 8424 | 0.1535 | 0.8511 | 0.8540 | 0.8526 | 0.9160 |
- | 0.1035 | 13.0 | 9126 | 0.1569 | 0.8515 | 0.8552 | 0.8533 | 0.9160 |
- | 0.1035 | 14.0 | 9828 | 0.1677 | 0.8530 | 0.8537 | 0.8534 | 0.9158 |
- | 0.097 | 15.0 | 10530 | 0.1721 | 0.8549 | 0.8557 | 0.8553 | 0.9159 |
- | 0.0912 | 16.0 | 11232 | 0.1822 | 0.8573 | 0.8574 | 0.8573 | 0.9165 |
- | 0.0912 | 17.0 | 11934 | 0.1969 | 0.8577 | 0.8578 | 0.8577 | 0.9158 |
- | 0.0854 | 18.0 | 12636 | 0.1969 | 0.8597 | 0.8587 | 0.8592 | 0.9165 |
- | 0.08 | 19.0 | 13338 | 0.2035 | 0.8587 | 0.8587 | 0.8587 | 0.9165 |
- | 0.0768 | 20.0 | 14040 | 0.2111 | 0.8593 | 0.8587 | 0.8590 | 0.9163 |
+ | No log | 1.0 | 702 | 0.2342 | 0.7774 | 0.7496 | 0.7632 | 0.8833 |
+ | 0.369 | 2.0 | 1404 | 0.1708 | 0.8050 | 0.8048 | 0.8049 | 0.9033 |
+ | 0.1681 | 3.0 | 2106 | 0.1646 | 0.8007 | 0.8078 | 0.8043 | 0.9054 |
+ | 0.1681 | 4.0 | 2808 | 0.1469 | 0.8250 | 0.8335 | 0.8292 | 0.9133 |
+ | 0.14 | 5.0 | 3510 | 0.1465 | 0.8235 | 0.8345 | 0.8290 | 0.9137 |
+ | 0.1279 | 6.0 | 4212 | 0.1517 | 0.8165 | 0.8323 | 0.8244 | 0.9127 |
+ | 0.1279 | 7.0 | 4914 | 0.1474 | 0.8224 | 0.8370 | 0.8297 | 0.9138 |
+ | 0.1212 | 8.0 | 5616 | 0.1500 | 0.8255 | 0.8409 | 0.8331 | 0.9141 |
+ | 0.1165 | 9.0 | 6318 | 0.1545 | 0.8297 | 0.8390 | 0.8343 | 0.9142 |
+ | 0.1138 | 10.0 | 7020 | 0.1590 | 0.8342 | 0.8467 | 0.8404 | 0.9150 |
+ | 0.1138 | 11.0 | 7722 | 0.1588 | 0.8383 | 0.8474 | 0.8428 | 0.9156 |
+ | 0.1099 | 12.0 | 8424 | 0.1547 | 0.8425 | 0.8446 | 0.8435 | 0.9156 |
+ | 0.1071 | 13.0 | 9126 | 0.1565 | 0.8475 | 0.8471 | 0.8473 | 0.9164 |
+ | 0.1071 | 14.0 | 9828 | 0.1625 | 0.8440 | 0.8489 | 0.8464 | 0.9156 |
+ | 0.1031 | 15.0 | 10530 | 0.1680 | 0.8486 | 0.8510 | 0.8498 | 0.9160 |
+ | 0.0992 | 16.0 | 11232 | 0.1722 | 0.8529 | 0.8505 | 0.8517 | 0.9156 |
+ | 0.0992 | 17.0 | 11934 | 0.1771 | 0.8527 | 0.8529 | 0.8528 | 0.9159 |
+ | 0.094 | 18.0 | 12636 | 0.1862 | 0.8555 | 0.8531 | 0.8543 | 0.9159 |
+ | 0.0892 | 19.0 | 13338 | 0.1884 | 0.8534 | 0.8534 | 0.8534 | 0.9156 |
+ | 0.086 | 20.0 | 14040 | 0.1949 | 0.8546 | 0.8533 | 0.8540 | 0.9154 |


  ### Framework versions

- - Transformers 4.30.2
- - Pytorch 1.13.1+cu116
- - Datasets 2.13.2
- - Tokenizers 0.13.3
+ - Transformers 4.33.2
+ - Pytorch 2.0.1+cu117
+ - Datasets 2.14.5
+ - Tokenizers 0.13.3
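The updated card documents evaluation metrics but no usage snippet, so here is a minimal inference sketch using the `transformers` token-classification pipeline. The repo id `almaghrabima/ner_column_TQ` is an assumption pieced together from the committer name and the model-index entry, and the sample sentence is one of the widget examples this commit removes from the YAML front matter.

```python
# Minimal sketch, assuming the checkpoint is published as "almaghrabima/ner_column_TQ"
# (inferred from the committer and model-index name, not confirmed by this diff).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="almaghrabima/ner_column_TQ",  # assumed repo id
    aggregation_strategy="simple",       # merge sub-word pieces into whole entities
)

# One of the widget examples removed from the card's front matter in this commit.
text = "india 0S0308Z8 trudeau 3000 Ravensburger Hamnoy, Lofoten of gold bestseller 620463000001"
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```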
pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2b81692eda5af51d8d9ffd4b1ec22dc290051cbed1753330ee4ee291ee6f341b
+ oid sha256:9c4d3c251a438df98883b8444105722bd5ec1443e679e055dad6efdf8d3064ac
  size 1736213293
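The `pytorch_model.bin` change only swaps the Git LFS pointer's `oid`; the file size is unchanged. Below is a small sketch of how a downloaded weight file could be checked against that pointer; the local path is hypothetical.

```python
# Verify a locally downloaded pytorch_model.bin against the LFS pointer above.
import hashlib
import os

EXPECTED_OID = "9c4d3c251a438df98883b8444105722bd5ec1443e679e055dad6efdf8d3064ac"
EXPECTED_SIZE = 1736213293
path = "pytorch_model.bin"  # hypothetical local path

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        digest.update(chunk)

assert os.path.getsize(path) == EXPECTED_SIZE, "size does not match the LFS pointer"
assert digest.hexdigest() == EXPECTED_OID, "sha256 does not match the LFS pointer"
print("pytorch_model.bin matches the LFS pointer")
```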