Archana02 committed on
Commit 0f20a25
1 Parent(s): 1ca243a

End of training

README.md CHANGED
@@ -14,24 +14,20 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 3.0586
-- Ame: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3}
-- Anguages: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0}
-- Ducation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
-- Echnical skills: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
-- Escriptions: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
-- Esignation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
-- Hone number: {'precision': 0.75, 'recall': 1.0, 'f1': 0.8571428571428571, 'number': 3}
-- Ithub: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
-- Mail: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
-- Ocation: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
-- Ork experience company: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
-- Ork experience role: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4}
-- Rojects: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1}
-- Overall Precision: 0.12
-- Overall Recall: 0.1071
-- Overall F1: 0.1132
-- Overall Accuracy: 0.1429
+- Loss: 2.7407
+- Education: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5}
+- Email: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+- Github: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0}
+- Location: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3}
+- Name: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+- Name : {'precision': 0.2, 'recall': 0.5, 'f1': 0.28571428571428575, 'number': 2}
+- Phone Number: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2}
+- Soft Skills: {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0}
+- Technical Skills: {'precision': 0.2, 'recall': 0.35714285714285715, 'f1': 0.25641025641025644, 'number': 14}
+- Overall Precision: 0.1176
+- Overall Recall: 0.2
+- Overall F1: 0.1481
+- Overall Accuracy: 0.1475
 
 ## Model description
 
@@ -60,16 +56,16 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Ame | Anguages | Ducation | Ear of experience | Echnical skills | Escriptions | Esignation | Hone number | Ithub | Mail | Ob | Ocation | Ork experience company | Ork experience role | Rojects | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:------------------------------------------------------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
-| 3.2249 | 1.0 | 3 | 3.1464 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 1.0, 'recall': 0.6666666666666666, 'f1': 0.8, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.0909 | 0.0714 | 0.08 | 0.1071 |
-| 2.9723 | 2.0 | 6 | 3.0839 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.6666666666666666, 'recall': 0.6666666666666666, 'f1': 0.6666666666666666, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.0769 | 0.0714 | 0.0741 | 0.0714 |
-| 2.8482 | 3.0 | 9 | 3.0586 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.75, 'recall': 1.0, 'f1': 0.8571428571428571, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 4} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 1} | 0.12 | 0.1071 | 0.1132 | 0.1429 |
+| Training Loss | Epoch | Step | Validation Loss | Education | Email | Github | Linkedin | Location | Name | Name | Phone Number | Soft Skills | Technical Skills | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+|:-------------:|:-----:|:----:|:---------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:-----------------------------------------------------------------------------------------:|:---------------------------------------------------------:|:---------------------------------------------------------:|:----------------------------------------------------------------------------------------------------------:|:-----------------:|:--------------:|:----------:|:----------------:|
+| 2.9387 | 1.0 | 2 | 2.8701 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.14285714285714285, 'recall': 0.5, 'f1': 0.22222222222222224, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 14} | 0.0185 | 0.0333 | 0.0238 | 0.0328 |
+| 2.6716 | 2.0 | 4 | 2.7798 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.2, 'recall': 0.5, 'f1': 0.28571428571428575, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.10526315789473684, 'recall': 0.14285714285714285, 'f1': 0.12121212121212122, 'number': 14} | 0.0612 | 0.1 | 0.0759 | 0.1311 |
+| 2.5524 | 3.0 | 6 | 2.7407 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 5} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 3} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.2, 'recall': 0.5, 'f1': 0.28571428571428575, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 2} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 0} | {'precision': 0.2, 'recall': 0.35714285714285715, 'f1': 0.25641025641025644, 'number': 14} | 0.1176 | 0.2 | 0.1481 | 0.1475 |
 
 
 ### Framework versions
 
-- Transformers 4.34.1
+- Transformers 4.35.0
 - Pytorch 2.1.0+cu118
 - Datasets 2.14.6
 - Tokenizers 0.14.1
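As a quick sanity check on the card's metrics, the "Overall F1" values above are consistent with the harmonic mean of the reported overall precision and recall (the usual F1 definition used by seqeval-style evaluation). A minimal sketch; the `f1` helper is illustrative and not part of the training code:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Final-epoch figures from the updated card: precision 0.1176, recall 0.2.
print(round(f1(0.1176, 0.2), 4))    # → 0.1481 (matches "Overall F1" in the new card)
# Old card: precision 0.12, recall 0.1071.
print(round(f1(0.12, 0.1071), 4))   # → 0.1132 (matches "Overall F1" in the old card)
```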
logs/events.out.tfevents.1699071654.4465f2ac9a17.457.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:1fb67f1e49e9462959f4cdfb9e2005685a42f08c3cd7fcbdc7a91296cd6b61b4
-size 6324
+oid sha256:7c4c084804dbd5f35c80198ad3e15d06c344904cce059202da6a5df08db825fb
+size 7321
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:987db915de20dc7918e5e080b88575beef8ff375679bd77f4580e6eafb942789
+oid sha256:63e1df0270e2fcdca677ba7960b3bc61b1980a55e7a8b01906c662c016f658e9
 size 450592048