
layoutlm-funsd

This model is a fine-tuned version of microsoft/layoutlm-base-uncased on the FUNSD dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7107
  • Answer: precision 0.6961, recall 0.7899, F1 0.7400 (support: 809)
  • Header: precision 0.3333, recall 0.3277, F1 0.3305 (support: 119)
  • Question: precision 0.7800, recall 0.8488, F1 0.8129 (support: 1065)
  • Overall Precision: 0.7211
  • Overall Recall: 0.7938
  • Overall F1: 0.7557
  • Overall Accuracy: 0.8008
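
The overall scores are micro-averages over the three entity types. As a sanity check, they can be recomputed from the per-entity precision, recall, and support reported above:

```python
# Recover TP counts from each entity's recall and support, and predicted-span
# counts from its precision, then micro-average across entity types.
per_entity = {
    # (precision, recall, support) as reported on the evaluation set
    "Answer":   (0.696078431372549, 0.7898640296662547, 809),
    "Header":   (0.3333333333333333, 0.3277310924369748, 119),
    "Question": (0.7799827437446074, 0.8488262910798122, 1065),
}

tp = sum(round(r * n) for p, r, n in per_entity.values())               # true positives
pred = sum(round(round(r * n) / p) for p, r, n in per_entity.values())  # predicted spans
gold = sum(n for _, _, n in per_entity.values())                        # gold spans

overall_precision = tp / pred
overall_recall = tp / gold
overall_f1 = 2 * overall_precision * overall_recall / (overall_precision + overall_recall)
print(round(overall_precision, 4), round(overall_recall, 4), round(overall_f1, 4))
```

Rounded to four decimals, this reproduces the reported Overall Precision, Recall, and F1.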

Model description

layoutlm-funsd is microsoft/layoutlm-base-uncased fine-tuned for token classification on FUNSD. LayoutLM extends a BERT-style text encoder with 2-D position embeddings computed from each token's bounding box, so the model can use the spatial layout of a scanned form in addition to its text.
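
LayoutLM's 2-D position embeddings expect each token bounding box normalized to a 0–1000 coordinate grid before it is passed to the model. A minimal sketch of that preprocessing step (the helper name here is illustrative, not part of the Transformers API):

```python
def normalize_bbox(bbox, page_width, page_height):
    """Scale an (x0, y0, x1, y1) pixel box to LayoutLM's 0-1000 grid."""
    x0, y0, x1, y1 = bbox
    return [
        int(1000 * x0 / page_width),
        int(1000 * y0 / page_height),
        int(1000 * x1 / page_width),
        int(1000 * y1 / page_height),
    ]

# A token box on a 762x1000-pixel page:
print(normalize_bbox((381, 250, 500, 300), 762, 1000))
```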

Intended uses & limitations

The model is intended for form understanding on scanned documents: labeling tokens as parts of question, answer, or header spans, e.g. for key-value extraction. It was fine-tuned on a small English-language dataset, so performance on other languages, domains, or layouts is untested; header detection in particular remains weak (F1 ≈ 0.33 on the evaluation set).

Training and evaluation data

The model was fine-tuned on FUNSD (Form Understanding in Noisy Scanned Documents), a dataset of 199 fully annotated scanned forms (149 for training, 50 for testing) with entities labeled as question, answer, header, or other.
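
Token classification on FUNSD is typically set up with BIO tags over the three entity types plus O, giving seven classes. The exact label ordering below is an assumption — the checkpoint's config.json is authoritative:

```python
# Assumed BIO label set for FUNSD token classification (7 classes).
labels = ["O",
          "B-HEADER", "I-HEADER",
          "B-QUESTION", "I-QUESTION",
          "B-ANSWER", "I-ANSWER"]
id2label = dict(enumerate(labels))
label2id = {label: i for i, label in enumerate(labels)}
print(len(labels))  # 7
```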

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
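
With 15 epochs and 10 optimizer steps per epoch (see the Step column in the results table), the linear scheduler decays the learning rate from 3e-05 to zero over 150 steps. A minimal sketch of that schedule, assuming no warmup:

```python
# Linear LR schedule (no warmup assumed): the rate decays from 3e-05 to 0
# over the run's 150 optimizer steps (10 steps/epoch * 15 epochs).
base_lr = 3e-5
total_steps = 150

def lr_at(step):
    """Learning rate used at a given optimizer step."""
    return base_lr * max(0.0, 1 - step / total_steps)

print(lr_at(0), lr_at(75), lr_at(150))  # full rate, half rate, zero
```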

Training results

| Training Loss | Epoch | Step | Validation Loss | Answer (P / R / F1) | Header (P / R / F1) | Question (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 1.8202 | 1.0 | 10 | 1.5835 | 0.0251 / 0.0321 / 0.0282 | 0.0000 / 0.0000 / 0.0000 | 0.2037 / 0.2197 / 0.2114 | 0.1190 | 0.1305 | 0.1245 | 0.3882 |
| 1.4316 | 2.0 | 20 | 1.2683 | 0.1616 / 0.1459 / 0.1533 | 0.0000 / 0.0000 / 0.0000 | 0.4461 / 0.5559 / 0.4950 | 0.3452 | 0.3562 | 0.3506 | 0.5544 |
| 1.1076 | 3.0 | 30 | 0.9812 | 0.4479 / 0.5155 / 0.4793 | 0.0000 / 0.0000 / 0.0000 | 0.5667 / 0.6460 / 0.6038 | 0.5087 | 0.5544 | 0.5306 | 0.6776 |
| 0.845 | 4.0 | 40 | 0.8144 | 0.5346 / 0.7256 / 0.6156 | 0.0294 / 0.0168 / 0.0214 | 0.6285 / 0.7277 / 0.6745 | 0.5686 | 0.6844 | 0.6211 | 0.7435 |
| 0.6785 | 5.0 | 50 | 0.7277 | 0.6070 / 0.7540 / 0.6725 | 0.2985 / 0.1681 / 0.2151 | 0.6952 / 0.7390 / 0.7164 | 0.6429 | 0.7110 | 0.6752 | 0.7748 |
| 0.5677 | 6.0 | 60 | 0.6958 | 0.6478 / 0.7639 / 0.7011 | 0.2935 / 0.2269 / 0.2559 | 0.7031 / 0.8207 / 0.7574 | 0.6636 | 0.7622 | 0.7095 | 0.7865 |
| 0.4873 | 7.0 | 70 | 0.6717 | 0.6487 / 0.7738 / 0.7057 | 0.3131 / 0.2605 / 0.2844 | 0.7403 / 0.8216 / 0.7788 | 0.6821 | 0.7687 | 0.7228 | 0.7916 |
| 0.4374 | 8.0 | 80 | 0.6668 | 0.6781 / 0.7812 / 0.7260 | 0.3173 / 0.2773 / 0.2960 | 0.7431 / 0.8394 / 0.7884 | 0.6963 | 0.7822 | 0.7368 | 0.7951 |
| 0.3894 | 9.0 | 90 | 0.6763 | 0.6646 / 0.8084 / 0.7295 | 0.3465 / 0.2941 / 0.3182 | 0.7590 / 0.8131 / 0.7851 | 0.6986 | 0.7802 | 0.7371 | 0.7977 |
| 0.3492 | 10.0 | 100 | 0.6727 | 0.6944 / 0.7837 / 0.7364 | 0.3223 / 0.3277 / 0.3250 | 0.7624 / 0.8404 / 0.7995 | 0.7101 | 0.7868 | 0.7465 | 0.7993 |
| 0.3166 | 11.0 | 110 | 0.6756 | 0.6956 / 0.8022 / 0.7451 | 0.3083 / 0.3109 / 0.3096 | 0.7672 / 0.8479 / 0.8055 | 0.7126 | 0.7973 | 0.7525 | 0.8022 |
| 0.2962 | 12.0 | 120 | 0.7018 | 0.6984 / 0.7985 / 0.7451 | 0.3223 / 0.3277 / 0.3250 | 0.7716 / 0.8535 / 0.8105 | 0.7167 | 0.7998 | 0.7560 | 0.8007 |
| 0.2861 | 13.0 | 130 | 0.7136 | 0.6987 / 0.7998 / 0.7458 | 0.3305 / 0.3277 / 0.3291 | 0.7859 / 0.8479 / 0.8157 | 0.7246 | 0.7973 | 0.7592 | 0.8021 |
| 0.2685 | 14.0 | 140 | 0.7143 | 0.6964 / 0.7911 / 0.7407 | 0.3250 / 0.3277 / 0.3264 | 0.7803 / 0.8469 / 0.8122 | 0.7203 | 0.7933 | 0.7550 | 0.8003 |
| 0.2625 | 15.0 | 150 | 0.7107 | 0.6961 / 0.7899 / 0.7400 | 0.3333 / 0.3277 / 0.3305 | 0.7800 / 0.8488 / 0.8129 | 0.7211 | 0.7938 | 0.7557 | 0.8008 |

Per-entity columns give precision / recall / F1 rounded to four decimals; supports are constant across epochs (Answer: 809, Header: 119, Question: 1065).
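
The per-entity scores are entity-level: a predicted span counts as correct only when both its label and its boundaries exactly match the reference. A toy illustration with BIO tags (the sequences here are invented, not from FUNSD):

```python
def spans(tags):
    """Extract (type, start, end) entity spans from a BIO tag sequence."""
    out, start = [], None
    for i, tag in enumerate(tags + ["O"]):  # sentinel flushes the last span
        if start is not None and not tag.startswith("I-"):
            out.append((tags[start][2:], start, i))
            start = None
        if tag.startswith("B-"):
            start = i
    return out

gold = ["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "I-ANSWER"]
pred = ["B-QUESTION", "I-QUESTION", "O", "B-ANSWER", "O"]

tp = len(set(spans(gold)) & set(spans(pred)))
precision = tp / len(spans(pred))
recall = tp / len(spans(gold))
print(precision, recall)  # the truncated answer span earns no credit
```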

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1

Model tree for nishthaahuja25/layoutlm-funsd

  • Fine-tuned from microsoft/layoutlm-base-uncased