
flan-t5-large-cars-descriptions

This model is a fine-tuned version of google/flan-t5-large on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7687
  • Rouge1: 19.5278
  • Rouge2: 13.4042
  • Rougel: 17.2713
  • Rougelsum: 18.2289
  • Gen Len: 19.0
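
For reference, here is a minimal inference sketch using the standard transformers seq2seq API. The prompt below is hypothetical, since the expected input format is not documented in this card; adapt it to your data.

```python
# Minimal inference sketch, assuming the standard transformers seq2seq API.
# pip install transformers torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "fashxp/flan-t5-large-cars-descriptions"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical prompt; the actual expected input format is not documented here.
prompt = "Generate a description for: BMW 3 Series, 2021, 2.0L petrol, automatic"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```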

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto Seq2SeqTrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
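
As a sketch only, the listed values map onto transformers Seq2SeqTrainingArguments roughly as below. Only the hyperparameter values come from this card; output_dir, evaluation_strategy, and predict_with_generate are assumptions, and dataset/model/collator setup is omitted.

```python
# Sketch of the training configuration, assuming the transformers Trainer API.
# Dataset, model, and data collator setup are omitted and would be needed in practice.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-large-cars-descriptions",  # assumption
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-8 matches the transformers
    # default optimizer settings, so no explicit optimizer argument is needed.
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",  # assumption, matching the per-epoch rows below
    predict_with_generate=True,   # assumption, needed for ROUGE during evaluation
)
```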

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 28   | 0.8985          | 20.106  | 13.8185 | 17.5397 | 19.1007   | 19.0    |
| No log        | 2.0   | 56   | 0.8444          | 19.963  | 13.2848 | 17.4604 | 18.8592   | 19.0    |
| No log        | 3.0   | 84   | 0.8163          | 19.9916 | 13.9678 | 17.8614 | 19.0133   | 19.0    |
| No log        | 4.0   | 112  | 0.7918          | 19.8144 | 12.9257 | 17.5265 | 18.6706   | 19.0    |
| No log        | 5.0   | 140  | 0.7859          | 19.6936 | 13.9027 | 17.548  | 18.4906   | 19.0    |
| No log        | 6.0   | 168  | 0.7796          | 19.4132 | 13.4927 | 17.3465 | 18.1291   | 19.0    |
| No log        | 7.0   | 196  | 0.7732          | 19.7803 | 13.3453 | 17.5581 | 18.4902   | 19.0    |
| No log        | 8.0   | 224  | 0.7724          | 19.2412 | 12.4472 | 16.8241 | 17.8911   | 19.0    |
| No log        | 9.0   | 252  | 0.7687          | 19.5278 | 13.4042 | 17.2713 | 18.2289   | 19.0    |
| No log        | 10.0  | 280  | 0.7691          | 19.5748 | 13.3162 | 17.5093 | 18.4578   | 19.0    |
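
Scores like those above can be computed with the Hugging Face evaluate library; a minimal sketch follows. The predictions and references here are placeholder strings, not the actual evaluation set.

```python
# Sketch of the ROUGE computation, assuming the Hugging Face evaluate library.
# pip install evaluate rouge_score
import evaluate

rouge = evaluate.load("rouge")
predictions = ["a generated car description"]   # hypothetical model outputs
references = ["the reference car description"]  # hypothetical ground truth
scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```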

Framework versions

  • Transformers 4.35.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
