
nyt-ingredient-tagger-gte-base

This model is a fine-tuned version of thenlper/gte-base on the nyt_ingredients dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8715
  • Comment: precision 0.6873, recall 0.8166, F1 0.7464 (support: 7280)
  • Name: precision 0.8039, recall 0.8268, F1 0.8152 (support: 9286)
  • Qty: precision 0.9874, recall 0.9860, F1 0.9867 (support: 7522)
  • Range End: precision 0.6286, recall 0.9362, F1 0.7521 (support: 94)
  • Unit: precision 0.9322, recall 0.9777, F1 0.9544 (support: 6061)
  • Overall Precision: 0.8399
  • Overall Recall: 0.8946
  • Overall F1: 0.8664
  • Overall Accuracy: 0.8432

Model description

The author has not provided a detailed description. Based on the rest of the card, this is a token-classification model: the thenlper/gte-base encoder fine-tuned to label the tokens of a recipe ingredient phrase with the entity types Comment, Name, Qty, Range End, and Unit.

Intended uses & limitations

Intended uses are not documented by the author. In practice the model is suited to tagging free-text ingredient lines from recipes with quantity, unit, name, comment, and range-end spans. The evaluation results above indicate that Qty and Unit spans are recognized very reliably, while Comment spans are noticeably harder (F1 around 0.75). A minimal usage sketch follows.
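
The snippet below is a minimal, untested sketch of how the model could be queried with the transformers token-classification pipeline; the aggregation strategy and the example ingredient line are illustrative choices, not part of the original card.

```python
from transformers import pipeline

# Load the fine-tuned tagger from the Hub; "simple" aggregation merges
# subword pieces back into whole-word entity spans.
tagger = pipeline(
    "token-classification",
    model="napsternxg/nyt-ingredient-tagger-gte-base",
    aggregation_strategy="simple",
)

# Tag a single ingredient line (illustrative example).
for entity in tagger("2 1/2 cups finely chopped yellow onion"):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```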

Training and evaluation data

The data preparation is not documented by the author. Per the card, the model was fine-tuned and evaluated on the nyt_ingredients dataset of annotated ingredient phrases; a hedged sketch of inspecting that dataset follows.
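
If the dataset is hosted on the Hugging Face Hub, it could be inspected with the datasets library. The repository id below is taken from the short name given in this card and is an assumption; it may need the author's namespace prefix, so verify it against the dataset card.

```python
from datasets import load_dataset

# Hypothetical repository id -- the card only gives the short name
# "nyt_ingredients"; adjust the namespace if needed.
ds = load_dataset("nyt_ingredients")

print(ds)             # available splits and column names
print(ds["train"][0]) # one annotated ingredient phrase
```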

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of the equivalent Trainer configuration follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
  • label_smoothing_factor: 0.1
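
These values map directly onto Hugging Face TrainingArguments as used by the Trainer API. The sketch below is one plausible reconstruction; the output directory and any arguments not listed above are assumptions.

```python
from transformers import TrainingArguments

# Plausible reconstruction of the listed hyperparameters; output_dir and
# evaluation/logging cadence are assumptions, not from the card.
training_args = TrainingArguments(
    output_dir="nyt-ingredient-tagger-gte-base",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    label_smoothing_factor=0.1,
)
```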

Training results

Per-label cells report precision / recall / F1 on the validation set (rounded to four decimal places); validation support counts are constant across checkpoints (Comment: 6830, Name: 8829, Qty: 7140, Range End: 94, Unit: 5724).

| Training Loss | Epoch | Step | Validation Loss | Comment | Name | Qty | Range End | Unit | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.944 | 0.2 | 1000 | 0.9202 | 0.6121 / 0.7545 / 0.6758 | 0.7844 / 0.8165 / 0.8002 | 0.9741 / 0.9909 / 0.9824 | 0.6214 / 0.9255 / 0.7436 | 0.9131 / 0.9862 / 0.9483 | 0.8069 | 0.8795 | 0.8416 | 0.8220 |
| 0.9154 | 0.4 | 2000 | 0.9014 | 0.6405 / 0.7861 / 0.7058 | 0.7932 / 0.8205 / 0.8066 | 0.9790 / 0.9901 / 0.9845 | 0.5948 / 0.9681 / 0.7368 | 0.9199 / 0.9872 / 0.9524 | 0.8193 | 0.8884 | 0.8524 | 0.8306 |
| 0.9086 | 0.59 | 3000 | 0.8954 | 0.6560 / 0.8157 / 0.7271 | 0.7949 / 0.8216 / 0.8080 | 0.9759 / 0.9910 / 0.9834 | 0.6377 / 0.9362 / 0.7586 | 0.9212 / 0.9838 / 0.9514 | 0.8232 | 0.8953 | 0.8577 | 0.8333 |
| 0.8962 | 0.79 | 4000 | 0.8914 | 0.6682 / 0.7991 / 0.7278 | 0.7972 / 0.8239 / 0.8103 | 0.9794 / 0.9906 / 0.9850 | 0.6484 / 0.8830 / 0.7477 | 0.9248 / 0.9841 / 0.9535 | 0.8304 | 0.8918 | 0.8600 | 0.8360 |
| 0.8891 | 0.99 | 5000 | 0.8838 | 0.6801 / 0.7994 / 0.7350 | 0.7981 / 0.8255 / 0.8115 | 0.9788 / 0.9910 / 0.9849 | 0.6615 / 0.9149 / 0.7679 | 0.9254 / 0.9864 / 0.9549 | 0.8346 | 0.8930 | 0.8628 | 0.8389 |
| 0.8641 | 1.19 | 6000 | 0.8874 | 0.6708 / 0.8076 / 0.7329 | 0.8106 / 0.8281 / 0.8193 | 0.9736 / 0.9936 / 0.9835 | 0.6250 / 0.9574 / 0.7563 | 0.9219 / 0.9858 / 0.9528 | 0.8331 | 0.8965 | 0.8636 | 0.8376 |
| 0.8716 | 1.39 | 7000 | 0.8784 | 0.6831 / 0.8132 / 0.7425 | 0.8092 / 0.8280 / 0.8185 | 0.9784 / 0.9919 / 0.9851 | 0.6718 / 0.9362 / 0.7822 | 0.9253 / 0.9827 / 0.9531 | 0.8382 | 0.8966 | 0.8664 | 0.8420 |
| 0.8613 | 1.58 | 8000 | 0.8823 | 0.6763 / 0.8208 / 0.7416 | 0.8007 / 0.8274 / 0.8138 | 0.9790 / 0.9915 / 0.9852 | 0.6164 / 0.9574 / 0.7500 | 0.9251 / 0.9820 / 0.9527 | 0.8327 | 0.8981 | 0.8642 | 0.8402 |
| 0.8744 | 1.78 | 9000 | 0.8788 | 0.6842 / 0.8205 / 0.7462 | 0.8026 / 0.8287 / 0.8154 | 0.9824 / 0.9901 / 0.9862 | 0.6818 / 0.9574 / 0.7965 | 0.9226 / 0.9893 / 0.9548 | 0.8365 | 0.8996 | 0.8669 | 0.8423 |
| 0.8644 | 1.98 | 10000 | 0.8753 | 0.6871 / 0.8086 / 0.7429 | 0.8013 / 0.8283 / 0.8146 | 0.9827 / 0.9894 / 0.9860 | 0.6960 / 0.9255 / 0.7945 | 0.9274 / 0.9818 / 0.9538 | 0.8386 | 0.8948 | 0.8658 | 0.8416 |
| 0.8374 | 2.18 | 11000 | 0.8823 | 0.6925 / 0.8050 / 0.7445 | 0.7997 / 0.8272 / 0.8132 | 0.9837 / 0.9896 / 0.9867 | 0.6541 / 0.9255 / 0.7665 | 0.9255 / 0.9848 / 0.9542 | 0.8397 | 0.8943 | 0.8661 | 0.8415 |
| 0.8363 | 2.38 | 12000 | 0.8869 | 0.6901 / 0.8145 / 0.7472 | 0.8070 / 0.8273 / 0.8170 | 0.9824 / 0.9908 / 0.9865 | 0.6825 / 0.9149 / 0.7818 | 0.9266 / 0.9865 / 0.9557 | 0.8409 | 0.8972 | 0.8681 | 0.8439 |
| 0.8336 | 2.57 | 13000 | 0.8826 | 0.6855 / 0.8004 / 0.7385 | 0.8088 / 0.8280 / 0.8183 | 0.9803 / 0.9912 / 0.9857 | 0.6641 / 0.9255 / 0.7733 | 0.9257 / 0.9857 / 0.9547 | 0.8399 | 0.8940 | 0.8661 | 0.8416 |
| 0.8369 | 2.77 | 14000 | 0.8762 | 0.6956 / 0.7927 / 0.7410 | 0.8046 / 0.8272 / 0.8157 | 0.9805 / 0.9922 / 0.9863 | 0.6667 / 0.9362 / 0.7788 | 0.9256 / 0.9871 / 0.9554 | 0.8423 | 0.8924 | 0.8667 | 0.8428 |
| 0.8402 | 2.97 | 15000 | 0.8754 | 0.6977 / 0.8110 / 0.7501 | 0.8055 / 0.8314 / 0.8182 | 0.9836 / 0.9906 / 0.9871 | 0.7456 / 0.9043 / 0.8173 | 0.9257 / 0.9864 / 0.9551 | 0.8433 | 0.8975 | 0.8695 | 0.8452 |
| 0.8036 | 3.17 | 16000 | 0.8847 | 0.6951 / 0.8085 / 0.7475 | 0.8062 / 0.8270 / 0.8165 | 0.9824 / 0.9909 / 0.9866 | 0.7304 / 0.8936 / 0.8038 | 0.9270 / 0.9869 / 0.9560 | 0.8428 | 0.8957 | 0.8685 | 0.8448 |
| 0.8037 | 3.37 | 17000 | 0.8834 | 0.6962 / 0.8023 / 0.7455 | 0.8005 / 0.8265 / 0.8133 | 0.9810 / 0.9924 / 0.9867 | 0.7328 / 0.9043 / 0.8095 | 0.9268 / 0.9841 / 0.9546 | 0.8414 | 0.8939 | 0.8668 | 0.8431 |
| 0.8108 | 3.56 | 18000 | 0.8876 | 0.6942 / 0.8094 / 0.7474 | 0.8001 / 0.8263 / 0.8129 | 0.9797 / 0.9924 / 0.9860 | 0.7040 / 0.9362 / 0.8037 | 0.9266 / 0.9862 / 0.9555 | 0.8399 | 0.8960 | 0.8670 | 0.8441 |
| 0.806 | 3.76 | 19000 | 0.8833 | 0.7025 / 0.8053 / 0.7504 | 0.8100 / 0.8299 / 0.8198 | 0.9814 / 0.9922 / 0.9868 | 0.7297 / 0.8617 / 0.7902 | 0.9282 / 0.9825 / 0.9546 | 0.8464 | 0.8951 | 0.8701 | 0.8454 |
| 0.8118 | 3.96 | 20000 | 0.8805 | 0.6981 / 0.8100 / 0.7499 | 0.8090 / 0.8290 / 0.8189 | 0.9818 / 0.9913 / 0.9865 | 0.6692 / 0.9255 / 0.7768 | 0.9278 / 0.9855 / 0.9558 | 0.8443 | 0.8966 | 0.8696 | 0.8433 |
| 0.7792 | 4.16 | 21000 | 0.8955 | 0.6977 / 0.8088 / 0.7492 | 0.8073 / 0.8273 / 0.8171 | 0.9816 / 0.9917 / 0.9866 | 0.6960 / 0.9255 / 0.7945 | 0.9261 / 0.9858 / 0.9551 | 0.8435 | 0.8959 | 0.8689 | 0.8438 |
| 0.7844 | 4.36 | 22000 | 0.8965 | 0.6993 / 0.8010 / 0.7467 | 0.8035 / 0.8264 / 0.8148 | 0.9808 / 0.9922 / 0.9864 | 0.7107 / 0.9149 / 0.8000 | 0.9251 / 0.9864 / 0.9548 | 0.8429 | 0.8940 | 0.8677 | 0.8420 |
| 0.7783 | 4.55 | 23000 | 0.8986 | 0.6973 / 0.7988 / 0.7446 | 0.8045 / 0.8261 / 0.8152 | 0.9825 / 0.9917 / 0.9871 | 0.6880 / 0.9149 / 0.7854 | 0.9253 / 0.9867 / 0.9550 | 0.8430 | 0.8934 | 0.8674 | 0.8422 |
| 0.784 | 4.75 | 24000 | 0.8966 | 0.6980 / 0.8042 / 0.7473 | 0.8030 / 0.8282 / 0.8154 | 0.9821 / 0.9920 / 0.9870 | 0.6935 / 0.9149 / 0.7890 | 0.9266 / 0.9853 / 0.9550 | 0.8426 | 0.8951 | 0.8680 | 0.8443 |
| 0.776 | 4.95 | 25000 | 0.8964 | 0.6977 / 0.8044 / 0.7473 | 0.8060 / 0.8284 / 0.8171 | 0.9822 / 0.9916 / 0.9869 | 0.6992 / 0.9149 / 0.7926 | 0.9262 / 0.9862 / 0.9552 | 0.8435 | 0.8952 | 0.8686 | 0.8442 |
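
The per-label dictionaries reported above follow the output format of the seqeval metric. The sketch below shows how such numbers are typically computed with the evaluate library during token-classification evaluation; the toy label sequences are illustrative only and are not taken from this model's label set.

```python
import evaluate

# seqeval computes entity-level precision/recall/F1 per label plus the
# overall_* aggregates reported in the tables above.
seqeval = evaluate.load("seqeval")

# Toy predictions/references in IOB2 format (illustrative only).
predictions = [["B-QTY", "B-UNIT", "B-NAME", "I-NAME", "O"]]
references = [["B-QTY", "B-UNIT", "B-NAME", "I-NAME", "B-COMMENT"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"], results["overall_f1"])
```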

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1
