
roberta-large-ner-ghtk-cs-add-3label-10-new-data-3090-14Sep-1

This model was trained from scratch; the training dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.2655
  • Overall Precision: 0.8811
  • Overall Recall: 0.9220
  • Overall F1: 0.9011
  • Overall Accuracy: 0.9590

Per-label results (precision / recall / F1 / support, in the seqeval output format; a reproduction sketch follows the table):

| Label | Precision | Recall | F1 | Support |
| --- | --- | --- | --- | --- |
| Tk | 0.7615 | 0.7155 | 0.7378 | 116 |
| A | 0.9243 | 0.9629 | 0.9432 | 431 |
| Gày | 0.7561 | 0.9118 | 0.8267 | 34 |
| Gày trừu tượng | 0.8747 | 0.8873 | 0.8810 | 488 |
| Iền | 0.7115 | 0.9487 | 0.8132 | 39 |
| Iờ | 0.5962 | 0.8158 | 0.6889 | 38 |
| Ã đơn | 0.8743 | 0.8227 | 0.8477 | 203 |
| Đt | 0.9417 | 0.9943 | 0.9673 | 878 |
| Đt trừu tượng | 0.7674 | 0.8498 | 0.8065 | 233 |
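
The per-label dictionaries reported in this card match the output format of the seqeval metric as exposed through the evaluate library. The snippet below is a minimal sketch of how such numbers can be reproduced; the use of seqeval is an assumption, and the BIO-tagged sequences are toy examples standing in for the undocumented validation set.

```python
# Minimal sketch (assumption: metrics were produced with evaluate's "seqeval" wrapper,
# which emits exactly this {'precision', 'recall', 'f1', 'number'} format per label).
import evaluate

seqeval = evaluate.load("seqeval")

# Toy BIO-tagged sequences standing in for the undocumented validation data.
references = [["O", "B-Đt", "I-Đt", "O", "B-Gày"]]
predictions = [["O", "B-Đt", "I-Đt", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["Đt"])            # per-label dict: precision, recall, f1, number (support)
print(results["overall_f1"])    # plus overall_precision / overall_recall / overall_accuracy
```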

Model description

More information needed

Intended uses & limitations

More information needed
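
In the absence of author guidance, the checkpoint can at least be exercised like any standard Transformers token-classification model. The following is a minimal sketch, assuming the repository exposes the usual token-classification head; the full Hub ID may require the owning organization prefixed to the name above, and the input sentence is a hypothetical example.

```python
# Minimal usage sketch (assumptions: standard token-classification head;
# the full Hub ID may need an organization prefix in front of this name).
from transformers import pipeline

model_id = "roberta-large-ner-ghtk-cs-add-3label-10-new-data-3090-14Sep-1"

ner = pipeline(
    "token-classification",
    model=model_id,
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

# Hypothetical Vietnamese customer-support sentence; predicted labels follow the list above.
print(ner("Đơn hàng giao lúc 9 giờ sáng ngày 14/09, số điện thoại 0901234567."))
```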

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
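
As a point of reference, the list above maps onto transformers.TrainingArguments roughly as follows. This is a sketch only: output_dir is assumed to mirror the model name, the evaluation cadence is inferred from the per-epoch rows in the results table below, and the base checkpoint, datasets, and label set are not documented here, so the Trainer wiring is omitted.

```python
# Sketch only: the hyperparameters above expressed as TrainingArguments.
# output_dir is assumed; eval cadence is inferred from the per-epoch results table.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-add-3label-10-new-data-3090-14Sep-1",
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="epoch",  # metrics are reported once per epoch below
)
```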

Training results

Per-label cells show precision / recall / F1, rounded to four decimals; each label's support (n) is fixed across epochs and shown in the column header.

| Training Loss | Epoch | Step | Validation Loss | Tk (n=116) | A (n=431) | Gày (n=34) | Gày trừu tượng (n=488) | Iền (n=39) | Iờ (n=38) | Ã đơn (n=203) | Đt (n=878) | Đt trừu tượng (n=233) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| No log | 1.0 | 291 | 0.1747 | 0.5000 / 0.1121 / 0.1831 | 0.9263 / 0.9327 / 0.9295 | 0.7297 / 0.7941 / 0.7606 | 0.8789 / 0.8770 / 0.8779 | 0.6607 / 0.9487 / 0.7789 | 0.5484 / 0.8947 / 0.6800 | 0.9181 / 0.7734 / 0.8396 | 0.9136 / 0.9875 / 0.9491 | 0.7863 / 0.8841 / 0.8323 | 0.8740 | 0.8825 | 0.8782 | 0.9544 |
| 0.1055 | 2.0 | 582 | 0.1941 | 0.8113 / 0.3707 / 0.5089 | 0.9226 / 0.9675 / 0.9445 | 0.7250 / 0.8529 / 0.7838 | 0.9077 / 0.8668 / 0.8868 | 0.6731 / 0.8974 / 0.7692 | 0.5962 / 0.8158 / 0.6889 | 0.8839 / 0.6749 / 0.7654 | 0.8978 / 1.0000 / 0.9461 | 0.7298 / 0.8927 / 0.8031 | 0.8689 | 0.8947 | 0.8816 | 0.9542 |
| 0.1055 | 3.0 | 873 | 0.1802 | 0.7377 / 0.7759 / 0.7563 | 0.9279 / 0.9559 / 0.9417 | 0.7674 / 0.9706 / 0.8571 | 0.8689 / 0.9098 / 0.8889 | 0.6604 / 0.8974 / 0.7609 | 0.6250 / 0.7895 / 0.6977 | 0.8772 / 0.7389 / 0.8021 | 0.9386 / 0.9932 / 0.9651 | 0.7529 / 0.8369 / 0.7927 | 0.8764 | 0.9191 | 0.8972 | 0.9558 |
| 0.0456 | 4.0 | 1164 | 0.2306 | 0.7500 / 0.6724 / 0.7091 | 0.9270 / 0.9722 / 0.9490 | 0.6889 / 0.9118 / 0.7848 | 0.8822 / 0.9057 / 0.8938 | 0.7170 / 0.9744 / 0.8261 | 0.5538 / 0.9474 / 0.6990 | 0.7679 / 0.8473 / 0.8056 | 0.9306 / 0.9932 / 0.9609 | 0.6028 / 0.9313 / 0.7319 | 0.8409 | 0.9370 | 0.8864 | 0.9518 |
| 0.0456 | 5.0 | 1455 | 0.1873 | 0.7596 / 0.6810 / 0.7182 | 0.9303 / 0.9606 / 0.9452 | 0.6316 / 0.7059 / 0.6667 | 0.9114 / 0.8852 / 0.8981 | 0.7500 / 0.9231 / 0.8276 | 0.6038 / 0.8421 / 0.7033 | 0.8624 / 0.8030 / 0.8316 | 0.9478 / 0.9920 / 0.9694 | 0.8391 / 0.8283 / 0.8337 | 0.8976 | 0.9122 | 0.9048 | 0.9596 |
| 0.028 | 6.0 | 1746 | 0.2340 | 0.8000 / 0.6552 / 0.7204 | 0.9154 / 0.9536 / 0.9341 | 0.7209 / 0.9118 / 0.8052 | 0.8698 / 0.9037 / 0.8864 | 0.6852 / 0.9487 / 0.7957 | 0.5500 / 0.8684 / 0.6735 | 0.8177 / 0.8177 / 0.8177 | 0.9268 / 0.9943 / 0.9593 | 0.8305 / 0.8412 / 0.8358 | 0.8745 | 0.9203 | 0.8968 | 0.9565 |
| 0.0154 | 7.0 | 2037 | 0.2404 | 0.7615 / 0.7155 / 0.7378 | 0.9286 / 0.9652 / 0.9465 | 0.7500 / 0.8824 / 0.8108 | 0.8969 / 0.8914 / 0.8941 | 0.6981 / 0.9487 / 0.8043 | 0.5818 / 0.8421 / 0.6882 | 0.8426 / 0.8177 / 0.8300 | 0.9521 / 0.9954 / 0.9733 | 0.7588 / 0.8369 / 0.7959 | 0.8852 | 0.9220 | 0.9032 | 0.9578 |
| 0.0154 | 8.0 | 2328 | 0.2621 | 0.7500 / 0.6724 / 0.7091 | 0.9224 / 0.9652 / 0.9433 | 0.7500 / 0.8824 / 0.8108 | 0.8658 / 0.9119 / 0.8882 | 0.6852 / 0.9487 / 0.7957 | 0.5763 / 0.8947 / 0.7010 | 0.8513 / 0.8177 / 0.8342 | 0.9268 / 0.9954 / 0.9599 | 0.7365 / 0.8755 / 0.8000 | 0.8661 | 0.9285 | 0.8962 | 0.9560 |
| 0.0074 | 9.0 | 2619 | 0.2572 | 0.7615 / 0.7155 / 0.7378 | 0.9183 / 0.9652 / 0.9412 | 0.7561 / 0.9118 / 0.8267 | 0.8773 / 0.8934 / 0.8853 | 0.6981 / 0.9487 / 0.8043 | 0.5962 / 0.8158 / 0.6889 | 0.8737 / 0.8177 / 0.8448 | 0.9417 / 0.9943 / 0.9673 | 0.7704 / 0.8498 / 0.8082 | 0.8806 | 0.9232 | 0.9014 | 0.9593 |
| 0.0074 | 10.0 | 2910 | 0.2655 | 0.7615 / 0.7155 / 0.7378 | 0.9243 / 0.9629 / 0.9432 | 0.7561 / 0.9118 / 0.8267 | 0.8747 / 0.8873 / 0.8810 | 0.7115 / 0.9487 / 0.8132 | 0.5962 / 0.8158 / 0.6889 | 0.8743 / 0.8227 / 0.8477 | 0.9417 / 0.9943 / 0.9673 | 0.7674 / 0.8498 / 0.8065 | 0.8811 | 0.9220 | 0.9011 | 0.9590 |

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1

Model size

  • 559M params (Safetensors, F32)