
ml_gen_seo_google_23_05_2024

This model was trained from scratch on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7733

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 3
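The schedule above can be sketched in plain Python; this is a minimal illustration, assuming the standard `linear` schedule with warmup (LR rises linearly over the 50 warmup steps, then decays linearly to 0). The total step count (~489) is estimated from the results table below (step 475 falls at epoch 2.9141 of 3) and is not stated in the card.

```python
BASE_LR = 5e-5       # learning_rate
WARMUP_STEPS = 50    # lr_scheduler_warmup_steps
TOTAL_STEPS = 489    # estimated, not logged in the card

def lr_at_step(step: int) -> float:
    """Learning rate at a given optimizer step under linear warmup + linear decay."""
    if step < WARMUP_STEPS:
        # warmup: ramp from 0 up to BASE_LR
        return BASE_LR * step / WARMUP_STEPS
    # decay: ramp from BASE_LR down to 0 at TOTAL_STEPS
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

# Effective batch size = per-device batch * gradient accumulation steps
effective_batch = 4 * 2  # = 8, matching total_train_batch_size above
```

The gradient-accumulation product explains why `total_train_batch_size` (8) differs from `train_batch_size` (4).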

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|---------------|--------|------|-----------------|
| 0.9707        | 0.1534 | 25   | 0.7829          |
| 0.942         | 0.3067 | 50   | 0.7871          |
| 0.8262        | 0.4601 | 75   | 0.7827          |
| 0.9281        | 0.6135 | 100  | 0.7894          |
| 0.9142        | 0.7669 | 125  | 0.7706          |
| 0.8757        | 0.9202 | 150  | 0.7701          |
| 0.8237        | 1.0736 | 175  | 0.7883          |
| 0.8219        | 1.2270 | 200  | 0.7684          |
| 0.8051        | 1.3804 | 225  | 0.7779          |
| 0.7711        | 1.5337 | 250  | 0.7831          |
| 0.8685        | 1.6871 | 275  | 0.7721          |
| 0.7802        | 1.8405 | 300  | 0.7804          |
| 0.778         | 1.9939 | 325  | 0.7812          |
| 0.7685        | 2.1472 | 350  | 0.7782          |
| 0.8233        | 2.3006 | 375  | 0.7678          |
| 0.7752        | 2.4540 | 400  | 0.7717          |
| 0.7144        | 2.6074 | 425  | 0.7722          |
| 0.7322        | 2.7607 | 450  | 0.7719          |
| 0.6849        | 2.9141 | 475  | 0.7733          |
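The logged validation losses can be scanned to pick the strongest checkpoint; a minimal sketch over the table above (variable names are illustrative):

```python
# (step, validation_loss) pairs transcribed from the results table above
eval_log = [
    (25, 0.7829), (50, 0.7871), (75, 0.7827), (100, 0.7894),
    (125, 0.7706), (150, 0.7701), (175, 0.7883), (200, 0.7684),
    (225, 0.7779), (250, 0.7831), (275, 0.7721), (300, 0.7804),
    (325, 0.7812), (350, 0.7782), (375, 0.7678), (400, 0.7717),
    (425, 0.7722), (450, 0.7719), (475, 0.7733),
]

# Lowest validation loss marks the best-performing evaluation point
best_step, best_loss = min(eval_log, key=lambda row: row[1])
print(best_step, best_loss)  # step 375 with validation loss 0.7678
```

Note that the loss reported at the top of the card (0.7733) is the final evaluation at step 475, slightly above the best intermediate value (0.7678 at step 375).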

Framework versions

  • Transformers 4.41.0
  • PyTorch 2.3.0+cu121
  • Tokenizers 0.19.1
Model size: 244M parameters (F32 tensors, Safetensors format)