lmv2-2022-05-24

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unspecified dataset. It achieves the following results on the evaluation set; "Number" denotes the support (entity count) for each field, and a brief usage sketch follows the list:

  • Loss: 0.0484
  • Address Precision: 0.9474
  • Address Recall: 1.0
  • Address F1: 0.9730
  • Address Number: 18
  • Business Name Precision: 1.0
  • Business Name Recall: 1.0
  • Business Name F1: 1.0
  • Business Name Number: 13
  • City State Zip Code Precision: 0.8947
  • City State Zip Code Recall: 0.8947
  • City State Zip Code F1: 0.8947
  • City State Zip Code Number: 19
  • Ein Precision: 1.0
  • Ein Recall: 1.0
  • Ein F1: 1.0
  • Ein Number: 4
  • List Account Number Precision: 0.6
  • List Account Number Recall: 0.75
  • List Account Number F1: 0.6667
  • List Account Number Number: 4
  • Name Precision: 1.0
  • Name Recall: 0.9444
  • Name F1: 0.9714
  • Name Number: 18
  • Ssn Precision: 1.0
  • Ssn Recall: 1.0
  • Ssn F1: 1.0
  • Ssn Number: 8
  • Overall Precision: 0.9412
  • Overall Recall: 0.9524
  • Overall F1: 0.9467
  • Overall Accuracy: 0.9979
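
These are entity-level scores for fields extracted from documents (address, business name, city/state/ZIP, EIN, account numbers, name, SSN). Below is a minimal, hedged sketch of how a LayoutLMv2 token-classification checkpoint like this one is typically loaded and run with Transformers; the checkpoint path, image filename, and output handling are placeholders rather than details taken from this card.

```python
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForTokenClassification

# Note: LayoutLMv2 requires detectron2; the processor's built-in OCR requires pytesseract.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")

# Placeholder checkpoint path; point this at wherever the fine-tuned weights are stored.
model = LayoutLMv2ForTokenClassification.from_pretrained("lmv2-2022-05-24")
model.eval()

image = Image.open("document.png").convert("RGB")  # example scanned form (assumed filename)
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = model(**encoding)

# Map each token's highest-scoring class id back to its label name.
predictions = outputs.logits.argmax(-1).squeeze().tolist()
labels = [model.config.id2label[p] for p in predictions]
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"].squeeze().tolist())
print(list(zip(tokens, labels)))
```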

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the corresponding TrainingArguments setup follows the list):

  • learning_rate: 4e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: constant
  • num_epochs: 30
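
As a reference point, the hyperparameters above map onto a Transformers TrainingArguments configuration roughly as sketched below. This is an assumption-laden sketch, not the original training script: the number of labels and the output directory are placeholders, and the listed Adam betas and epsilon match the Trainer's default optimizer settings, so they need no explicit flags.

```python
from transformers import LayoutLMv2ForTokenClassification, TrainingArguments

# Mirrors the hyperparameters listed above; Adam betas=(0.9, 0.999) and
# epsilon=1e-08 are the Trainer defaults and are therefore not set explicitly.
training_args = TrainingArguments(
    output_dir="lmv2-2022-05-24",       # placeholder output directory
    learning_rate=4e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    lr_scheduler_type="constant",
    num_train_epochs=30,
)

model = LayoutLMv2ForTokenClassification.from_pretrained(
    "microsoft/layoutlmv2-base-uncased",
    num_labels=15,  # assumed: 7 fields as B-/I- tags plus "O"; the real label set is not in this card
)

# These arguments would then be passed to a transformers.Trainer together with the
# processed training and evaluation datasets, which are not released with this model.
```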

Training results

Each row of the results below lists, in order: Training Loss, Epoch, Step, Validation Loss; then Precision, Recall, F1, and Number (support) for each field in the order Address, Business Name, City State Zip Code, Ein, List Account Number, Name, Ssn; and finally Overall Precision, Overall Recall, Overall F1, and Overall Accuracy. A sketch of how such entity-level metrics are computed follows the table.
1.9388 1.0 79 1.5568 0.0 0.0 0.0 18 0.0 0.0 0.0 13 0.0 0.0 0.0 19 0.0 0.0 0.0 4 0.0 0.0 0.0 4 0.0 0.0 0.0 18 0.0 0.0 0.0 8 0.0 0.0 0.0 0.9465
1.3777 2.0 158 1.1259 0.0 0.0 0.0 18 0.0 0.0 0.0 13 0.0 0.0 0.0 19 0.0 0.0 0.0 4 0.0 0.0 0.0 4 0.0 0.0 0.0 18 0.0 0.0 0.0 8 0.0 0.0 0.0 0.9465
0.9629 3.0 237 0.7497 0.0 0.0 0.0 18 0.0 0.0 0.0 13 0.0 0.0 0.0 19 0.0 0.0 0.0 4 0.0 0.0 0.0 4 0.0 0.0 0.0 18 0.0 0.0 0.0 8 0.0 0.0 0.0 0.9465
0.6292 4.0 316 0.4818 0.0 0.0 0.0 18 0.0 0.0 0.0 13 0.0 0.0 0.0 19 0.0 0.0 0.0 4 0.0 0.0 0.0 4 0.0 0.0 0.0 18 0.1944 0.875 0.3182 8 0.1944 0.0833 0.1167 0.9523
0.3952 5.0 395 0.2982 0.2424 0.8889 0.3810 18 0.0 0.0 0.0 13 0.1111 0.1053 0.1081 19 0.0 0.0 0.0 4 0.0 0.0 0.0 4 0.0 0.0 0.0 18 0.6364 0.875 0.7368 8 0.2632 0.2976 0.2793 0.9660
0.2675 6.0 474 0.2183 1.0 0.9444 0.9714 18 0.0 0.0 0.0 13 0.8824 0.7895 0.8333 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.1905 0.4444 0.2667 18 0.5714 1.0 0.7273 8 0.5204 0.6071 0.5604 0.9810
0.2095 7.0 553 0.1990 1.0 0.9444 0.9714 18 0.0833 0.0769 0.08 13 0.9375 0.7895 0.8571 19 0.0 0.0 0.0 4 0.75 0.75 0.75 4 0.2647 0.5 0.3462 18 0.1739 1.0 0.2963 8 0.4109 0.6310 0.4977 0.9762
0.1928 8.0 632 0.1704 1.0 0.9444 0.9714 18 0.3158 0.4615 0.3750 13 0.9412 0.8421 0.8889 19 0.0 0.0 0.0 4 1.0 0.75 0.8571 4 0.3214 0.5 0.3913 18 0.5385 0.875 0.6667 8 0.5979 0.6905 0.6409 0.9849
0.159 9.0 711 0.1339 1.0 0.9444 0.9714 18 0.45 0.6923 0.5455 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.25 0.75 0.375 4 0.375 0.5 0.4286 18 0.2308 0.375 0.2857 8 0.5577 0.6905 0.6170 0.9871
0.1314 10.0 790 0.1199 0.9444 0.9444 0.9444 18 0.8571 0.9231 0.8889 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.7895 0.8333 0.8108 18 0.6667 1.0 0.8 8 0.8372 0.8571 0.8471 0.9897
0.1143 11.0 869 0.1127 0.9444 0.9444 0.9444 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.6667 1.0 0.8 8 0.9036 0.8929 0.8982 0.9903
0.1037 12.0 948 0.1039 0.85 0.9444 0.8947 18 0.9167 0.8462 0.8800 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.8889 0.8889 0.8889 18 0.6667 1.0 0.8 8 0.8471 0.8571 0.8521 0.9901
0.0925 13.0 1027 0.1124 1.0 0.9444 0.9714 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.75 0.75 0.75 4 1.0 0.9444 0.9714 18 0.5833 0.875 0.7000 8 0.9136 0.8810 0.8970 0.9904
0.0863 14.0 1106 0.1077 0.9444 0.9444 0.9444 18 0.7333 0.8462 0.7857 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.6154 1.0 0.7619 8 0.8488 0.8690 0.8588 0.9916
0.0845 15.0 1185 0.1035 0.9444 0.9444 0.9444 18 1.0 1.0 1.0 13 0.9412 0.8421 0.8889 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.5833 0.875 0.7000 8 0.8902 0.8690 0.8795 0.9921
0.0735 16.0 1264 0.0866 0.6667 0.8889 0.7619 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.6667 1.0 0.8 8 0.8315 0.8810 0.8555 0.9918
0.0714 17.0 1343 0.0781 0.9444 0.9444 0.9444 18 1.0 0.9231 0.9600 13 0.9412 0.8421 0.8889 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.6667 1.0 0.8 8 0.9012 0.8690 0.8848 0.9921
0.0656 18.0 1422 0.0816 0.8947 0.9444 0.9189 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.9444 0.9444 0.9444 18 0.6667 1.0 0.8 8 0.8824 0.8929 0.8876 0.9919
0.0602 19.0 1501 0.0770 0.8 0.8889 0.8421 18 0.8667 1.0 0.9286 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.9444 0.9444 0.9444 18 0.6667 1.0 0.8 8 0.8409 0.8810 0.8605 0.9912
0.0516 20.0 1580 0.0710 0.8095 0.9444 0.8718 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 0.6667 1.0 0.8 8 0.8721 0.8929 0.8824 0.9919
0.0475 21.0 1659 0.0686 0.6667 1.0 0.8 18 0.5 0.6154 0.5517 13 0.9412 0.8421 0.8889 19 0.0 0.0 0.0 4 0.6 0.75 0.6667 4 0.9412 0.8889 0.9143 18 0.6667 1.0 0.8 8 0.7340 0.8214 0.7753 0.9904
0.0431 22.0 1738 0.0715 0.8095 0.9444 0.8718 18 0.9286 1.0 0.9630 13 0.8421 0.8421 0.8421 19 0.0 0.0 0.0 4 0.75 0.75 0.75 4 0.9444 0.9444 0.9444 18 0.3529 0.75 0.48 8 0.7273 0.8571 0.7869 0.9933
0.0383 23.0 1817 0.0627 0.8947 0.9444 0.9189 18 0.9231 0.9231 0.9231 13 0.8947 0.8947 0.8947 19 0.0 0.0 0.0 4 0.75 0.75 0.75 4 1.0 0.8889 0.9412 18 0.5714 1.0 0.7273 8 0.8111 0.8690 0.8391 0.9961
0.0327 24.0 1896 0.0683 0.8095 0.9444 0.8718 18 0.6 0.9231 0.7273 13 0.8095 0.8947 0.8500 19 0.6 0.75 0.6667 4 0.75 0.75 0.75 4 0.9412 0.8889 0.9143 18 0.8889 1.0 0.9412 8 0.7835 0.9048 0.8398 0.9942
0.0292 25.0 1975 0.0674 0.8947 0.9444 0.9189 18 1.0 1.0 1.0 13 0.85 0.8947 0.8718 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 1.0 1.0 8 0.9186 0.9405 0.9294 0.9975
0.0269 26.0 2054 0.0691 0.85 0.9444 0.8947 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 1.0 1.0 8 0.9294 0.9405 0.9349 0.9976
0.024 27.0 2133 0.0484 0.9474 1.0 0.9730 18 1.0 1.0 1.0 13 0.8947 0.8947 0.8947 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 1.0 1.0 8 0.9412 0.9524 0.9467 0.9979
0.0221 28.0 2212 0.0619 0.85 0.9444 0.8947 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 1.0 1.0 8 0.9294 0.9405 0.9349 0.9976
0.0216 29.0 2291 0.0810 0.85 0.9444 0.8947 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 0.875 0.9333 8 0.9286 0.9286 0.9286 0.9960
0.0175 30.0 2370 0.0646 0.85 0.9444 0.8947 18 1.0 1.0 1.0 13 0.9444 0.8947 0.9189 19 1.0 1.0 1.0 4 0.6 0.75 0.6667 4 1.0 0.9444 0.9714 18 1.0 1.0 1.0 8 0.9294 0.9405 0.9349 0.9976
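
Per-field precision, recall, and F1 of this form are typically computed at the entity level with the seqeval library over BIO-tagged token sequences. The snippet below is a small illustrative sketch of that computation; the label names are assumptions, not the model's actual tag set.

```python
from seqeval.metrics import precision_score, recall_score, f1_score, classification_report

# Toy BIO-tagged sequences with illustrative (assumed) label names.
y_true = [["B-ADDRESS", "I-ADDRESS", "O", "B-SSN", "I-SSN"]]
y_pred = [["B-ADDRESS", "I-ADDRESS", "O", "B-SSN", "O"]]

print(precision_score(y_true, y_pred))        # entity-level precision
print(recall_score(y_true, y_pred))           # entity-level recall
print(f1_score(y_true, y_pred))               # entity-level F1
print(classification_report(y_true, y_pred))  # per-field breakdown, like the table above
```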

Framework versions

  • Transformers 4.20.0.dev0
  • Pytorch 1.11.0+cu113
  • Datasets 2.2.2
  • Tokenizers 0.12.1