---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/NMTIndoBaliBART
results: []
---
# pijarcandra22/NMTIndoBaliBART
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results at the end of training (epoch 47):
- Train Loss: 5.5352
- Validation Loss: 5.5635
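The checkpoint above can be loaded for inference with the TensorFlow `transformers` API. This is a minimal sketch, not an endorsed usage recipe from the authors; the sample Indonesian input sentence is a hypothetical illustration, and output quality is not guaranteed given the loss values reported below.

```python
# Minimal sketch: load the published checkpoint and run generation.
# The repo id comes from this model card; the input text is illustrative.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pijarcandra22/NMTIndoBaliBART"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Tokenize an (assumed) Indonesian sentence and generate a translation.
inputs = tokenizer("Selamat pagi", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```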
## Model description
Judging by the model name, this appears to be an Indonesian-to-Balinese neural machine translation model fine-tuned from `facebook/bart-base`. More information needed.
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 0.02, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
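The optimizer configuration above can be reconstructed with the `AdamWeightDecay` class that ships with `transformers` for TensorFlow. A sketch, assuming that is the implementation the training script used (the config's `name` field suggests so):

```python
# Sketch: rebuild the optimizer from the hyperparameter dict above.
# decay=0.0 and amsgrad=False match the library defaults, so they are omitted.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=0.02,       # unusually high for fine-tuning; copied from the card
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)
print(optimizer.weight_decay_rate)
```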
### Training results
Both losses plateau near 5.53/5.56 after the first epoch and never improve thereafter, which suggests the model did not converge to a useful translation solution (the 0.02 learning rate is unusually high for fine-tuning a pretrained BART model).
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 9.7885 | 5.6003 | 0 |
| 5.5737 | 5.5523 | 1 |
| 5.5346 | 5.5361 | 2 |
| 5.5189 | 5.5283 | 3 |
| 5.5149 | 5.5252 | 4 |
| 5.5123 | 5.5233 | 5 |
| 5.5116 | 5.5485 | 6 |
| 5.5095 | 5.5314 | 7 |
| 5.5120 | 5.5569 | 8 |
| 5.5137 | 5.5239 | 9 |
| 5.5170 | 5.5289 | 10 |
| 5.5180 | 5.5298 | 11 |
| 5.5217 | 5.5513 | 12 |
| 5.5219 | 5.5344 | 13 |
| 5.5248 | 5.5366 | 14 |
| 5.5268 | 5.5493 | 15 |
| 5.5260 | 5.5313 | 16 |
| 5.5290 | 5.5462 | 17 |
| 5.5299 | 5.5570 | 18 |
| 5.5293 | 5.5480 | 19 |
| 5.5378 | 5.5524 | 20 |
| 5.5317 | 5.5740 | 21 |
| 5.5328 | 5.5543 | 22 |
| 5.5327 | 5.5537 | 23 |
| 5.5330 | 5.5356 | 24 |
| 5.5304 | 5.5492 | 25 |
| 5.5355 | 5.5388 | 26 |
| 5.5337 | 5.5812 | 27 |
| 5.5355 | 5.5598 | 28 |
| 5.5348 | 5.5489 | 29 |
| 5.5373 | 5.5526 | 30 |
| 5.5357 | 5.5575 | 31 |
| 5.5377 | 5.5439 | 32 |
| 5.5404 | 5.5367 | 33 |
| 5.5383 | 5.5819 | 34 |
| 5.5359 | 5.5815 | 35 |
| 5.5370 | 5.5499 | 36 |
| 5.5340 | 5.5622 | 37 |
| 5.5373 | 5.5667 | 38 |
| 5.5360 | 5.5548 | 39 |
| 5.5327 | 5.5555 | 40 |
| 5.5365 | 5.5642 | 41 |
| 5.5375 | 5.5496 | 42 |
| 5.5336 | 5.5424 | 43 |
| 5.5359 | 5.5761 | 44 |
| 5.5360 | 5.5821 | 45 |
| 5.5362 | 5.5742 | 46 |
| 5.5352 | 5.5635 | 47 |
### Framework versions
- Transformers 4.40.2
- TensorFlow 2.15.0
- Datasets 2.19.1
- Tokenizers 0.19.1