# distilhubert-finetuned-gtzan
This model is a fine-tuned version of ntu-spml/distilhubert on the GTZAN dataset. It achieves the following results on the evaluation set:
- Loss: 1.7615
- Accuracy: 0.85
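As a quick illustration (not part of the original card), the checkpoint can be tried with the `transformers` audio-classification pipeline. The repo id `nithin04/distilhubert-finetuned-gtzan` is taken from this card; the audio path is a placeholder.

```python
# Minimal inference sketch: classify a music clip with this fine-tuned checkpoint.
from transformers import pipeline

classifier = pipeline(
    "audio-classification",
    model="nithin04/distilhubert-finetuned-gtzan",  # repo id from this card
)

# "example.wav" is a placeholder path to a local audio file (ideally ~30 s of music,
# matching the GTZAN clip length).
predictions = classifier("example.wav")
print(predictions)  # e.g. [{"label": "rock", "score": 0.93}, ...]
```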
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
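Since the card does not document the preprocessing, the following is only an assumed data-loading sketch. It presumes the GTZAN copy hosted on the Hub as `marsyas/gtzan` and a 90/10 train/test split with seed 42; neither detail is stated in the card.

```python
# Assumed data-loading sketch; the actual split and preprocessing for this model are undocumented.
from datasets import load_dataset, Audio

gtzan = load_dataset("marsyas/gtzan", "all")  # Hub repo id assumed
gtzan = gtzan["train"].train_test_split(seed=42, shuffle=True, test_size=0.1)  # split assumed

# DistilHuBERT expects 16 kHz input, so resample the audio column accordingly.
gtzan = gtzan.cast_column("audio", Audio(sampling_rate=16_000))
print(gtzan)
```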
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (an equivalent `TrainingArguments` sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
- mixed_precision_training: Native AMP
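The values above map directly onto `transformers.TrainingArguments`. Below is a sketch of an equivalent configuration; the original training script is not included in the card, so anything beyond the listed values (e.g. the output directory and evaluation cadence) is an assumption.

```python
# Sketch of TrainingArguments matching the listed hyperparameters.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the defaults, so they are not set explicitly.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="distilhubert-finetuned-gtzan",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    fp16=True,                        # "Native AMP" mixed precision
    evaluation_strategy="epoch",      # assumed from the per-epoch validation results below
    logging_strategy="epoch",
)
```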
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
2.2914 | 1.0 | 57 | 2.2595 | 0.25 |
2.1225 | 2.0 | 114 | 2.0265 | 0.57 |
1.7631 | 3.0 | 171 | 1.6482 | 0.59 |
1.3445 | 4.0 | 228 | 1.3380 | 0.62 |
1.1548 | 5.0 | 285 | 1.0589 | 0.72 |
0.9289 | 6.0 | 342 | 0.8541 | 0.76 |
0.708 | 7.0 | 399 | 0.7628 | 0.79 |
0.4497 | 8.0 | 456 | 0.7088 | 0.82 |
0.4061 | 9.0 | 513 | 0.6118 | 0.85 |
0.286 | 10.0 | 570 | 0.6684 | 0.79 |
0.1739 | 11.0 | 627 | 0.5965 | 0.83 |
0.1103 | 12.0 | 684 | 0.8414 | 0.81 |
0.0922 | 13.0 | 741 | 0.5937 | 0.87 |
0.0166 | 14.0 | 798 | 0.5786 | 0.86 |
0.0075 | 15.0 | 855 | 0.7950 | 0.84 |
0.0014 | 16.0 | 912 | 0.8492 | 0.87 |
0.0006 | 17.0 | 969 | 1.2642 | 0.82 |
0.0815 | 18.0 | 1026 | 1.1173 | 0.87 |
0.0 | 19.0 | 1083 | 1.2181 | 0.86 |
0.0 | 20.0 | 1140 | 1.6673 | 0.85 |
0.0 | 21.0 | 1197 | 1.4749 | 0.86 |
0.0611 | 22.0 | 1254 | 2.2533 | 0.82 |
0.0978 | 23.0 | 1311 | 2.0092 | 0.86 |
0.0 | 24.0 | 1368 | 2.3586 | 0.83 |
0.0 | 25.0 | 1425 | 1.7617 | 0.86 |
0.0 | 26.0 | 1482 | 1.7425 | 0.86 |
0.0 | 27.0 | 1539 | 1.8418 | 0.85 |
0.0 | 28.0 | 1596 | 1.6987 | 0.87 |
0.0 | 29.0 | 1653 | 1.9399 | 0.85 |
0.0 | 30.0 | 1710 | 2.4230 | 0.81 |
0.0 | 31.0 | 1767 | 1.4312 | 0.88 |
0.1807 | 32.0 | 1824 | 1.5278 | 0.87 |
0.0 | 33.0 | 1881 | 1.3795 | 0.88 |
0.0 | 34.0 | 1938 | 1.5051 | 0.88 |
0.0 | 35.0 | 1995 | 1.6587 | 0.85 |
0.0 | 36.0 | 2052 | 1.6256 | 0.86 |
0.0 | 37.0 | 2109 | 1.7290 | 0.85 |
0.0 | 38.0 | 2166 | 1.8676 | 0.87 |
0.0 | 39.0 | 2223 | 1.8963 | 0.86 |
0.166 | 40.0 | 2280 | 1.7057 | 0.85 |
0.1293 | 41.0 | 2337 | 1.4235 | 0.87 |
0.1491 | 42.0 | 2394 | 1.7916 | 0.85 |
0.1416 | 43.0 | 2451 | 1.8634 | 0.85 |
0.0 | 44.0 | 2508 | 1.6286 | 0.86 |
0.0526 | 45.0 | 2565 | 1.6242 | 0.86 |
0.0 | 46.0 | 2622 | 1.7576 | 0.85 |
0.0 | 47.0 | 2679 | 1.7897 | 0.85 |
0.0 | 48.0 | 2736 | 1.7571 | 0.85 |
0.0018 | 49.0 | 2793 | 1.6993 | 0.85 |
0.0 | 50.0 | 2850 | 1.7615 | 0.85 |
### Framework versions
- Transformers 4.42.3
- Pytorch 2.1.2
- Datasets 2.20.0
- Tokenizers 0.19.1