# layoutlm-funsd
This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set:
- Loss: 1.0784
- Answer: precision 0.3973, recall 0.5093, F1 0.4464 (809 entities)
- Header: precision 0.2602, recall 0.2689, F1 0.2645 (119 entities)
- Question: precision 0.5115, recall 0.6244, F1 0.5624 (1065 entities)
- Overall precision: 0.4508
- Overall recall: 0.5564
- Overall F1: 0.4981
- Overall accuracy: 0.6275
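The overall figures are micro-averages across the three entity types. A quick consistency check, recovering the class counts implied by the per-class precision, recall, and support reported above:

```python
# (precision, recall, support) per entity type, from the evaluation set above.
classes = {
    "Answer":   (0.39729990356798456, 0.5092707045735476, 809),
    "Header":   (0.2601626016260163,  0.2689075630252101, 119),
    "Question": (0.5115384615384615,  0.6244131455399061, 1065),
}

tp = predicted = gold = 0.0
for p, r, support in classes.values():
    correct = r * support        # true positives = recall * support
    tp += correct
    predicted += correct / p     # predicted spans = true positives / precision
    gold += support

precision = tp / predicted       # -> 0.4508
recall = tp / gold               # -> 0.5564
f1 = 2 * precision * recall / (precision + recall)  # -> 0.4981
```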
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
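Assuming the standard `Trainer` API, these settings correspond roughly to the following `TrainingArguments` sketch; the output directory is a placeholder, and the Adam betas/epsilon shown are the optimizer defaults:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # placeholder, not the original path
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=15,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                     # "Native AMP" mixed precision
)
```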
### Training results
Per-entity cells show precision / recall / F1 over 809 Answer, 119 Header, and 1065 Question entities, rounded to four decimals.

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.7434 | 1.0 | 10 | 1.5471 | 0.0516 / 0.0396 / 0.0448 | 0.0000 / 0.0000 / 0.0000 | 0.2705 / 0.1765 / 0.2136 | 0.1673 | 0.1104 | 0.1330 | 0.3331 |
| 1.4356 | 2.0 | 20 | 1.3534 | 0.2191 / 0.4425 / 0.2931 | 0.0441 / 0.0252 / 0.0321 | 0.2877 / 0.3944 / 0.3327 | 0.2470 | 0.3919 | 0.3030 | 0.4281 |
| 1.2677 | 3.0 | 30 | 1.2046 | 0.2470 / 0.4277 / 0.3131 | 0.2065 / 0.1597 / 0.1801 | 0.3335 / 0.5296 / 0.4093 | 0.2918 | 0.4661 | 0.3589 | 0.4856 |
| 1.1386 | 4.0 | 40 | 1.1240 | 0.2882 / 0.4363 / 0.3471 | 0.1875 / 0.1765 / 0.1818 | 0.3801 / 0.5643 / 0.4543 | 0.3341 | 0.4892 | 0.3971 | 0.5567 |
| 1.0425 | 5.0 | 50 | 1.0865 | 0.3107 / 0.4524 / 0.3684 | 0.2561 / 0.1765 / 0.2090 | 0.4102 / 0.6028 / 0.4882 | 0.3642 | 0.5163 | 0.4271 | 0.5740 |
| 1.0051 | 6.0 | 60 | 1.0745 | 0.3435 / 0.4586 / 0.3928 | 0.2245 / 0.1849 / 0.2028 | 0.4872 / 0.5540 / 0.5185 | 0.4115 | 0.4932 | 0.4487 | 0.5916 |
| 0.9533 | 7.0 | 70 | 1.0560 | 0.3291 / 0.4190 / 0.3687 | 0.2295 / 0.2353 / 0.2324 | 0.4150 / 0.6329 / 0.5013 | 0.3750 | 0.5223 | 0.4366 | 0.5919 |
| 0.8838 | 8.0 | 80 | 1.0296 | 0.3531 / 0.4710 / 0.4036 | 0.2119 / 0.2101 / 0.2110 | 0.4552 / 0.6160 / 0.5235 | 0.4026 | 0.5329 | 0.4586 | 0.6141 |
| 0.8148 | 9.0 | 90 | 1.0582 | 0.3895 / 0.4858 / 0.4323 | 0.2571 / 0.2269 / 0.2411 | 0.5230 / 0.5869 / 0.5531 | 0.4526 | 0.5243 | 0.4858 | 0.6139 |
| 0.8139 | 10.0 | 100 | 1.0429 | 0.3730 / 0.4808 / 0.4201 | 0.2479 / 0.2437 / 0.2458 | 0.4694 / 0.6272 / 0.5370 | 0.4204 | 0.5449 | 0.4747 | 0.6247 |
| 0.7228 | 11.0 | 110 | 1.0542 | 0.3845 / 0.4920 / 0.4317 | 0.2703 / 0.2521 / 0.2609 | 0.5043 / 0.6122 / 0.5530 | 0.4428 | 0.5419 | 0.4874 | 0.6257 |
| 0.7193 | 12.0 | 120 | 1.0835 | 0.3972 / 0.5179 / 0.4496 | 0.2613 / 0.2437 / 0.2522 | 0.5154 / 0.6141 / 0.5604 | 0.4526 | 0.5529 | 0.4977 | 0.6268 |
| 0.687 | 13.0 | 130 | 1.0892 | 0.4002 / 0.5426 / 0.4607 | 0.2574 / 0.2185 / 0.2364 | 0.5263 / 0.5915 / 0.5570 | 0.4572 | 0.5494 | 0.4991 | 0.6255 |
| 0.6515 | 14.0 | 140 | 1.0795 | 0.3986 / 0.5056 / 0.4458 | 0.2586 / 0.2521 / 0.2553 | 0.5205 / 0.6197 / 0.5658 | 0.4560 | 0.5514 | 0.4992 | 0.6262 |
| 0.6453 | 15.0 | 150 | 1.0784 | 0.3973 / 0.5093 / 0.4464 | 0.2602 / 0.2689 / 0.2645 | 0.5115 / 0.6244 / 0.5624 | 0.4508 | 0.5564 | 0.4981 | 0.6275 |
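The metrics above are entity-level, in the style of seqeval: predicted BIO tag sequences are grouped into spans, and a span counts as correct only if both its boundaries and its type match a gold span. A minimal span decoder sketch (the `decode_bio` helper is illustrative, not part of the model; check the model's `config.json` for the actual label set):

```python
def decode_bio(tags):
    """Group a BIO tag sequence into (entity_type, start, end) spans, end-exclusive."""
    spans, start, etype = [], None, None
    for i, tag in enumerate(list(tags) + ["O"]):   # trailing "O" flushes the last span
        if tag == f"I-{etype}" and start is not None:
            continue                               # still inside the current span
        if start is not None:                      # close the span that just ended
            spans.append((etype, start, i))
            start, etype = None, None
        if tag != "O":                             # "B-X" (or a stray "I-X") opens a span
            start, etype = i, tag[2:]
    return spans

# FUNSD uses the Header / Question / Answer types scored in the table above.
print(decode_bio(["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]))
# -> [('QUESTION', 0, 2), ('ANSWER', 3, 5)]
```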
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2