
results_pegasus6_hiba_wiki

This model is a fine-tuned version of google/pegasus-large on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0699
  • Rouge1 Fmeasure: 0.3654
  • Rouge2 Fmeasure: 0.2931
  • Rougel Fmeasure: 0.3466
  • Meteor: 0.2623
  • Bleu: 0.0000
  • Bertscore P: 0.9267
  • Bertscore R: 0.9204
  • Bertscore F1: 0.9235
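
The evaluation script itself is not part of this card; the following is a minimal sketch, assuming the Hugging Face `evaluate` library, of how metrics like these can be computed. The prediction and reference lists are placeholders, and the BERTScore aggregation (a plain mean over examples) is an assumption.

```python
# Hypothetical sketch of the metric computation; not the author's script.
import evaluate

predictions = ["a generated summary ..."]  # placeholder model outputs
references = ["a reference summary ..."]   # placeholder gold summaries

rouge = evaluate.load("rouge")
meteor = evaluate.load("meteor")
bleu = evaluate.load("bleu")
bertscore = evaluate.load("bertscore")

results = {}
results.update(rouge.compute(predictions=predictions, references=references))
results.update(meteor.compute(predictions=predictions, references=references))
results.update(bleu.compute(predictions=predictions, references=references))

# BERTScore returns per-example lists; averaging them is an assumption here.
bs = bertscore.compute(predictions=predictions, references=references, lang="en")
results["bertscore_p"] = sum(bs["precision"]) / len(bs["precision"])
results["bertscore_r"] = sum(bs["recall"]) / len(bs["recall"])
results["bertscore_f1"] = sum(bs["f1"]) / len(bs["f1"])

print(results)
```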

Model description

More information needed

Intended uses & limitations

More information needed
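
As a usage sketch, the model can be loaded like any `transformers` summarization checkpoint. This assumes summarization as the task (matching the `google/pegasus-large` base) and uses the repository id `hiba2/results_pegasus6_hiba_wiki` from this card; the generation settings are illustrative, not taken from the training setup.

```python
# Minimal usage sketch; task and generation settings are assumptions.
from transformers import pipeline

summarizer = pipeline("summarization", model="hiba2/results_pegasus6_hiba_wiki")

text = "..."  # replace with the document to summarize
print(summarizer(text, max_length=128, min_length=16, do_sample=False))
```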

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 4e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 4
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • num_epochs: 10
  • mixed_precision_training: Native AMP
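
As a rough illustration, these settings map onto `transformers.Seq2SeqTrainingArguments` as sketched below. Only the values listed above come from this card; `output_dir` and anything else is an assumption.

```python
# Sketch of the listed hyperparameters as Seq2SeqTrainingArguments;
# not the author's actual training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="results_pegasus6_hiba_wiki",  # assumed, not from the card
    learning_rate=4e-05,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 2 * 2 = 4
    lr_scheduler_type="linear",
    warmup_steps=250,
    num_train_epochs=10,
    fp16=True,  # "Native AMP" mixed precision
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 match the library defaults.
)
```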

Training results

| Training Loss | Epoch  | Step | Validation Loss | Rouge1 Fmeasure | Rouge2 Fmeasure | Rougel Fmeasure | Meteor | Bleu   | Bertscore P | Bertscore R | Bertscore F1 |
|:-------------:|:------:|:----:|:---------------:|:---------------:|:---------------:|:---------------:|:------:|:------:|:-----------:|:-----------:|:------------:|
| 0.2433        | 0.5222 | 500  | 0.2175          | 0.0259          | 0.0060          | 0.0259          | 0.1659 | 0.0000 | 0.9282      | 0.8784      | 0.9026       |
| 0.2165        | 1.0444 | 1000 | 0.1818          | 0.0532          | 0.0247          | 0.0501          | 0.2343 | 0.0000 | 0.9405      | 0.9278      | 0.9341       |
| 0.1815        | 1.5666 | 1500 | 0.1535          | 0.1065          | 0.0529          | 0.0833          | 0.2365 | 0.0000 | 0.9369      | 0.9215      | 0.9291       |
| 0.1558        | 2.0888 | 2000 | 0.1352          | 0.1544          | 0.0836          | 0.1246          | 0.2442 | 0.0000 | 0.9342      | 0.9247      | 0.9294       |
| 0.1449        | 2.6110 | 2500 | 0.1250          | 0.1934          | 0.0940          | 0.1573          | 0.2439 | 0.0000 | 0.9338      | 0.9242      | 0.9289       |
| 0.1356        | 3.1332 | 3000 | 0.1107          | 0.2681          | 0.1641          | 0.2233          | 0.2477 | 0.0000 | 0.9259      | 0.9199      | 0.9229       |
| 0.1256        | 3.6554 | 3500 | 0.1063          | 0.2722          | 0.1793          | 0.2380          | 0.2497 | 0.0000 | 0.9263      | 0.9204      | 0.9233       |
| 0.1187        | 4.1775 | 4000 | 0.0975          | 0.2946          | 0.2058          | 0.2663          | 0.2534 | 0.0000 | 0.9272      | 0.9207      | 0.9239       |
| 0.1157        | 4.6997 | 4500 | 0.0936          | 0.2898          | 0.2032          | 0.2546          | 0.2456 | 0.0000 | 0.9289      | 0.9205      | 0.9247       |
| 0.1078        | 5.2219 | 5000 | 0.0876          | 0.3094          | 0.2179          | 0.2811          | 0.2521 | 0.0000 | 0.9272      | 0.9207      | 0.9239       |
| 0.1029        | 5.7441 | 5500 | 0.0849          | 0.3324          | 0.2481          | 0.3016          | 0.2571 | 0.0000 | 0.9267      | 0.9204      | 0.9235       |
| 0.1029        | 6.2663 | 6000 | 0.0827          | 0.3357          | 0.2546          | 0.3146          | 0.2575 | 0.0000 | 0.9284      | 0.9199      | 0.9241       |
| 0.0992        | 6.7885 | 6500 | 0.0783          | 0.3510          | 0.2765          | 0.3238          | 0.2606 | 0.0000 | 0.9268      | 0.9202      | 0.9234       |
| 0.0954        | 7.3107 | 7000 | 0.0754          | 0.3327          | 0.2611          | 0.3139          | 0.2588 | 0.0000 | 0.9267      | 0.9204      | 0.9235       |
| 0.0938        | 7.8329 | 7500 | 0.0746          | 0.3592          | 0.2863          | 0.3404          | 0.2614 | 0.0000 | 0.9300      | 0.9216      | 0.9258       |
| 0.0938        | 8.3551 | 8000 | 0.0728          | 0.3718          | 0.3022          | 0.3492          | 0.2613 | 0.0000 | 0.9268      | 0.9204      | 0.9236       |
| 0.0900        | 8.8773 | 8500 | 0.0707          | 0.3600          | 0.2882          | 0.3442          | 0.2633 | 0.0000 | 0.9267      | 0.9204      | 0.9235       |
| 0.0911        | 9.3995 | 9000 | 0.0705          | 0.3677          | 0.2933          | 0.3464          | 0.2646 | 0.0000 | 0.9267      | 0.9204      | 0.9235       |
| 0.0881        | 9.9217 | 9500 | 0.0699          | 0.3654          | 0.2931          | 0.3466          | 0.2623 | 0.0000 | 0.9267      | 0.9204      | 0.9235       |

Framework versions

  • Transformers 4.42.0.dev0
  • Pytorch 2.3.0+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1