
roberta-large-ner-ghtk-cs-add-2label-20-new-data-3090-16Sep-1

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2505
  • Tk: precision 0.8421, recall 0.6897, F1 0.7583 (support: 116)
  • A: precision 0.9428, recall 0.9559, F1 0.9493 (support: 431)
  • Gày: precision 0.7273, recall 0.9412, F1 0.8205 (support: 34)
  • Gày trừu tượng: precision 0.9037, recall 0.9037, F1 0.9037 (support: 488)
  • Iờ: precision 0.6522, recall 0.7895, F1 0.7143 (support: 38)
  • Ã đơn: precision 0.8557, recall 0.8473, F1 0.8515 (support: 203)
  • Đt: precision 0.9296, recall 0.9920, F1 0.9598 (support: 878)
  • Đt trừu tượng: precision 0.8023, recall 0.9056, F1 0.8508 (support: 233)
  • Overall Precision: 0.8957
  • Overall Recall: 0.9290
  • Overall F1: 0.9120
  • Overall Accuracy: 0.9623
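
The label-level metrics above correspond to the entity types predicted by this token-classification model. As a minimal, hedged inference sketch (the repository id below is assumed to match this checkpoint's published path, and the input sentence is only a placeholder):

```python
from transformers import pipeline

# Assumed repository id for this checkpoint; replace with the actual Hub path.
model_id = "roberta-large-ner-ghtk-cs-add-2label-20-new-data-3090-16Sep-1"

# Token-classification pipeline; "simple" aggregation merges sub-word pieces
# belonging to the same predicted entity into a single span.
ner = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

# Placeholder input; replace with your own text.
print(ner("Khách hàng muốn đổi số điện thoại giao hàng vào ngày mai."))
```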

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2.5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
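
For illustration, a minimal sketch of these hyperparameters expressed as Transformers `TrainingArguments` (the output directory is a placeholder, any argument not listed above keeps its library default, and per-epoch evaluation is inferred from the results table below):

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="roberta-large-ner-ghtk-cs-add-2label-20-new-data-3090-16Sep-1",  # placeholder
    learning_rate=2.5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="epoch",  # inferred from the per-epoch validation results below
)
```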

Training results

Per-label columns report precision / recall / F1; the label support (n) is constant across epochs and is given in the header.

| Training Loss | Epoch | Step | Validation Loss | Tk (n=116) | A (n=431) | Gày (n=34) | Gày trừu tượng (n=488) | Iờ (n=38) | Ã đơn (n=203) | Đt (n=878) | Đt trừu tượng (n=233) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---|---:|---:|---:|:---|:---|:---|:---|:---|:---|:---|:---|---:|---:|---:|---:|
| No log | 1.0 | 308 | 0.1572 | 0.7024 / 0.5086 / 0.5900 | 0.8952 / 0.9907 / 0.9405 | 0.7174 / 0.9706 / 0.8250 | 0.8738 / 0.9221 / 0.8973 | 0.6271 / 0.9737 / 0.7629 | 0.8447 / 0.8571 / 0.8509 | 0.9123 / 0.9954 / 0.9521 | 0.8171 / 0.9013 / 0.8571 | 0.8701 | 0.9352 | 0.9015 | 0.9587 |
| 0.085 | 2.0 | 616 | 0.2070 | 0.6864 / 0.6983 / 0.6923 | 0.9284 / 0.9629 / 0.9453 | 0.7045 / 0.9118 / 0.7949 | 0.8662 / 0.9283 / 0.8961 | 0.5238 / 0.8684 / 0.6535 | 0.8086 / 0.8325 / 0.8204 | 0.9083 / 0.9932 / 0.9489 | 0.7669 / 0.8755 / 0.8176 | 0.8586 | 0.9327 | 0.8941 | 0.9535 |
| 0.085 | 3.0 | 924 | 0.2595 | 0.7308 / 0.1638 / 0.2676 | 0.9398 / 0.9420 / 0.9409 | 0.6739 / 0.9118 / 0.7750 | 0.8845 / 0.9098 / 0.8970 | 0.6000 / 0.8684 / 0.7097 | 0.7665 / 0.8571 / 0.8093 | 0.8521 / 0.9841 / 0.9133 | 0.8277 / 0.8455 / 0.8365 | 0.8535 | 0.8955 | 0.8740 | 0.9488 |
| 0.0427 | 4.0 | 1232 | 0.2174 | 0.7869 / 0.4138 / 0.5424 | 0.9254 / 0.9791 / 0.9515 | 0.5952 / 0.7353 / 0.6579 | 0.8596 / 0.9283 / 0.8926 | 0.6415 / 0.8947 / 0.7473 | 0.8964 / 0.8522 / 0.8737 | 0.8873 / 0.9954 / 0.9383 | 0.7615 / 0.8498 / 0.8032 | 0.8642 | 0.9199 | 0.8912 | 0.9581 |
| 0.0259 | 5.0 | 1540 | 0.2243 | 0.8222 / 0.6379 / 0.7184 | 0.9281 / 0.9582 / 0.9429 | 0.7250 / 0.8529 / 0.7838 | 0.8968 / 0.9078 / 0.9022 | 0.7273 / 0.6316 / 0.6761 | 0.9249 / 0.7882 / 0.8511 | 0.9555 / 0.9772 / 0.9662 | 0.7727 / 0.8755 / 0.8209 | 0.9048 | 0.9108 | 0.9078 | 0.9619 |
| 0.0259 | 6.0 | 1848 | 0.2213 | 0.7684 / 0.6293 / 0.6919 | 0.9262 / 0.9606 / 0.9431 | 0.7442 / 0.9412 / 0.8312 | 0.9057 / 0.9057 / 0.9057 | 0.5882 / 0.7895 / 0.6742 | 0.8883 / 0.8227 / 0.8542 | 0.9228 / 0.9932 / 0.9567 | 0.8125 / 0.8927 / 0.8507 | 0.8906 | 0.9244 | 0.9072 | 0.9608 |
| 0.017 | 7.0 | 2156 | 0.2342 | 0.7917 / 0.6552 / 0.7170 | 0.9116 / 0.9814 / 0.9453 | 0.7209 / 0.9118 / 0.8052 | 0.8945 / 0.9037 / 0.8991 | 0.7073 / 0.7632 / 0.7342 | 0.8557 / 0.8473 / 0.8515 | 0.9354 / 0.9897 / 0.9618 | 0.7610 / 0.8884 / 0.8198 | 0.8854 | 0.9285 | 0.9065 | 0.9597 |
| 0.017 | 8.0 | 2464 | 0.2378 | 0.8621 / 0.6466 / 0.7389 | 0.9408 / 0.9582 / 0.9494 | 0.7333 / 0.9706 / 0.8354 | 0.8818 / 0.9016 / 0.8916 | 0.6471 / 0.8684 / 0.7416 | 0.8906 / 0.8424 / 0.8658 | 0.9255 / 0.9897 / 0.9565 | 0.7962 / 0.8884 / 0.8398 | 0.8921 | 0.9257 | 0.9086 | 0.9619 |
| 0.0091 | 9.0 | 2772 | 0.2489 | 0.8438 / 0.6983 / 0.7642 | 0.9537 / 0.9559 / 0.9548 | 0.7111 / 0.9412 / 0.8101 | 0.8980 / 0.9016 / 0.8998 | 0.6739 / 0.8158 / 0.7381 | 0.8643 / 0.8473 / 0.8557 | 0.9276 / 0.9920 / 0.9587 | 0.8069 / 0.8970 / 0.8496 | 0.8970 | 0.9285 | 0.9125 | 0.9624 |
| 0.0041 | 10.0 | 3080 | 0.2505 | 0.8421 / 0.6897 / 0.7583 | 0.9428 / 0.9559 / 0.9493 | 0.7273 / 0.9412 / 0.8205 | 0.9037 / 0.9037 / 0.9037 | 0.6522 / 0.7895 / 0.7143 | 0.8557 / 0.8473 / 0.8515 | 0.9296 / 0.9920 / 0.9598 | 0.8023 / 0.9056 / 0.8508 | 0.8957 | 0.9290 | 0.9120 | 0.9623 |
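
The per-label dictionaries in the results above follow the output format of the seqeval metric (precision, recall, F1 and support per entity type). Below is a minimal sketch of how such metrics can be computed, assuming the evaluation uses the `evaluate` library's seqeval wrapper and made-up IOB2 tags for illustration:

```python
import evaluate

# seqeval scores entities (not individual tokens): precision, recall, F1 and
# support ("number") per label, plus overall precision/recall/F1/accuracy.
seqeval = evaluate.load("seqeval")

# Toy IOB2 sequences; the real tag set ("Tk", "A", "Gày", "Đt", ...) should be
# read from this model's config. These tags are placeholders.
references = [["B-Đt", "I-Đt", "O", "B-A", "O"]]
predictions = [["B-Đt", "I-Đt", "O", "O", "O"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["Đt"])          # e.g. {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1}
print(results["overall_f1"])  # aggregate F1 across all entity types
```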

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.1+cu121
  • Datasets 2.19.1
  • Tokenizers 0.19.1