# roberta-large-ner-ghtk-cs-add-2label-16-new-data-3090-16Sep-1
This model is a `roberta-large`-based token-classification (NER) model; the training dataset is not documented in this card. It achieves the following results on the evaluation set:
- Loss: 0.2797
- Overall Precision: 0.8852
- Overall Recall: 0.9298
- Overall F1: 0.9069
- Overall Accuracy: 0.9585

Per-label results (values rounded to four decimals):

Label | Precision | Recall | F1 | Support
---|---|---|---|---
Tk | 0.8000 | 0.7586 | 0.7788 | 116
A | 0.9243 | 0.9629 | 0.9432 | 431
Gày | 0.7111 | 0.9412 | 0.8101 | 34
Gày trừu tượng | 0.9048 | 0.8955 | 0.9001 | 488
Iờ | 0.6750 | 0.7105 | 0.6923 | 38
Ã đơn | 0.8301 | 0.8424 | 0.8362 | 203
Đt | 0.9442 | 0.9829 | 0.9632 | 878
Đt trừu tượng | 0.7365 | 0.9356 | 0.8242 | 233
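As a sanity check, the overall precision and recall can be reconstructed from the per-label figures: micro-averaging pools true positives, predicted spans, and gold spans across all eight labels. A minimal sketch, using the four-decimal values reported above (`round()` undoes the small rounding noise in the recovered counts):

```python
# Reconstruct micro-averaged overall metrics from per-label
# (precision, recall, support) triples reported in the model card.
per_label = {
    "Tk": (0.8000, 0.7586, 116),
    "A": (0.9243, 0.9629, 431),
    "Gày": (0.7111, 0.9412, 34),
    "Gày trừu tượng": (0.9048, 0.8955, 488),
    "Iờ": (0.6750, 0.7105, 38),
    "Ã đơn": (0.8301, 0.8424, 203),
    "Đt": (0.9442, 0.9829, 878),
    "Đt trừu tượng": (0.7365, 0.9356, 233),
}

# recall * support recovers the true-positive count per label;
# true positives / precision recovers the predicted-span count.
tp = sum(round(r * n) for p, r, n in per_label.values())
pred = sum(round(round(r * n) / p) for p, r, n in per_label.values())
gold = sum(n for p, r, n in per_label.values())

precision = tp / pred
recall = tp / gold
f1 = 2 * precision * recall / (precision + recall)
print(round(precision, 4), round(recall, 4), round(f1, 4))
# -> 0.8852 0.9298 0.9069, matching the overall metrics above
```

The reconstruction agrees with the reported Overall Precision/Recall/F1, confirming that the overall numbers are the micro-average over the 2,421 gold entities.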
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
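No warmup is listed, so the linear scheduler simply decays the learning rate from 2.5e-05 to zero over the whole run; at 298 optimizer steps per epoch (see the results table) and 10 epochs, that is 2,980 steps. A small sketch of the implied schedule (the `warmup_steps` parameter is included only for illustration and is 0 here):

```python
# Sketch of the schedule implied by lr_scheduler_type=linear with no
# warmup: the LR falls linearly from the initial value to zero over the
# total number of optimizer steps (298 steps/epoch x 10 epochs = 2980).
def linear_lr(step, base_lr=2.5e-5, total_steps=2980, warmup_steps=0):
    if step < warmup_steps:
        # linear warmup (unused in this run; shown for completeness)
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * (remaining / (total_steps - warmup_steps))

print(linear_lr(0))     # starts at the configured learning rate
print(linear_lr(1490))  # halfway through training: half the LR
print(linear_lr(2980))  # decayed to zero at the final step
```

This mirrors what `transformers` does internally for `lr_scheduler_type: linear`; the library equivalent is `get_linear_schedule_with_warmup`.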
### Training results
Per-label cells show precision / recall / F1 rounded to two decimals; the fixed support for each label is given in its column header.

Training Loss | Epoch | Step | Validation Loss | Tk (116) | A (431) | Gày (34) | Gày trừu tượng (488) | Iờ (38) | Ã đơn (203) | Đt (878) | Đt trừu tượng (233) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
No log | 1.0 | 298 | 0.2560 | 0.64/0.35/0.46 | 0.92/0.96/0.94 | 0.45/0.82/0.58 | 0.88/0.92/0.90 | 0.53/0.74/0.62 | 0.68/0.86/0.76 | 0.89/0.98/0.93 | 0.57/0.94/0.71 | 0.8055 | 0.9149 | 0.8567 | 0.9384 |
0.0856 | 2.0 | 596 | 0.1970 | 0.80/0.57/0.66 | 0.94/0.94/0.94 | 0.65/0.94/0.77 | 0.91/0.89/0.90 | 0.59/0.95/0.73 | 0.81/0.84/0.83 | 0.91/0.99/0.95 | 0.64/0.89/0.74 | 0.8555 | 0.9195 | 0.8863 | 0.9537 |
0.0856 | 3.0 | 894 | 0.2256 | 0.84/0.65/0.73 | 0.94/0.97/0.95 | 0.74/0.85/0.79 | 0.87/0.91/0.89 | 0.64/0.79/0.71 | 0.93/0.74/0.82 | 0.94/0.98/0.96 | 0.67/0.84/0.75 | 0.8789 | 0.9112 | 0.8947 | 0.9565 |
0.0382 | 4.0 | 1192 | 0.2189 | 0.79/0.89/0.84 | 0.97/0.92/0.94 | 0.63/0.74/0.68 | 0.88/0.92/0.90 | 0.54/0.87/0.67 | 0.88/0.83/0.85 | 0.97/0.98/0.97 | 0.65/0.94/0.77 | 0.8768 | 0.9323 | 0.9037 | 0.9575 |
0.0382 | 5.0 | 1490 | 0.2290 | 0.77/0.74/0.76 | 0.93/0.97/0.95 | 0.74/0.94/0.83 | 0.90/0.90/0.90 | 0.79/0.58/0.67 | 0.91/0.80/0.85 | 0.96/0.97/0.96 | 0.78/0.84/0.81 | 0.9063 | 0.9112 | 0.9088 | 0.9616 |
0.025 | 6.0 | 1788 | 0.2494 | 0.73/0.78/0.76 | 0.92/0.98/0.95 | 0.72/0.91/0.81 | 0.89/0.91/0.90 | 0.65/0.68/0.67 | 0.86/0.82/0.84 | 0.95/0.99/0.97 | 0.77/0.87/0.82 | 0.8872 | 0.9290 | 0.9076 | 0.9596 |
0.0136 | 7.0 | 2086 | 0.2645 | 0.80/0.76/0.78 | 0.93/0.96/0.94 | 0.73/0.97/0.84 | 0.90/0.90/0.90 | 0.68/0.74/0.71 | 0.89/0.82/0.85 | 0.96/0.97/0.96 | 0.72/0.93/0.81 | 0.8930 | 0.9240 | 0.9082 | 0.9598 |
0.0136 | 8.0 | 2384 | 0.2769 | 0.79/0.76/0.77 | 0.92/0.98/0.95 | 0.73/0.97/0.84 | 0.90/0.90/0.90 | 0.61/0.82/0.70 | 0.79/0.86/0.82 | 0.95/0.99/0.97 | 0.71/0.94/0.81 | 0.8762 | 0.9385 | 0.9063 | 0.9562 |
0.0086 | 9.0 | 2682 | 0.2666 | 0.80/0.76/0.78 | 0.93/0.97/0.95 | 0.74/0.91/0.82 | 0.91/0.90/0.90 | 0.63/0.66/0.64 | 0.88/0.84/0.86 | 0.94/0.98/0.96 | 0.78/0.92/0.84 | 0.8961 | 0.9265 | 0.9110 | 0.9607 |
0.0086 | 10.0 | 2980 | 0.2797 | 0.80/0.76/0.78 | 0.92/0.96/0.94 | 0.71/0.94/0.81 | 0.90/0.90/0.90 | 0.68/0.71/0.69 | 0.83/0.84/0.84 | 0.94/0.98/0.96 | 0.74/0.94/0.82 | 0.8852 | 0.9298 | 0.9069 | 0.9585 |
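The headline metrics come from the final (epoch 10) checkpoint, but overall validation F1 peaks at epoch 9 (0.9110) and validation loss bottoms out at epoch 2, a mild overfitting pattern. If per-epoch checkpoints were kept, the best one could be picked by hand, as in this sketch (`transformers` can do this automatically via `load_best_model_at_end` and `metric_for_best_model`):

```python
# Pick the checkpoint with the best overall validation F1 from the
# per-epoch results above, rather than keeping the last epoch.
overall_f1 = {
    1: 0.8567, 2: 0.8863, 3: 0.8947, 4: 0.9037, 5: 0.9088,
    6: 0.9076, 7: 0.9082, 8: 0.9063, 9: 0.9110, 10: 0.9069,
}

best_epoch = max(overall_f1, key=overall_f1.get)
print(best_epoch, overall_f1[best_epoch])  # epoch 9, F1 = 0.9110
```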
### Framework versions
- Transformers 4.44.0
- PyTorch 2.3.1+cu121
- Datasets 2.19.1
- Tokenizers 0.19.1