---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/t5Indo2Jawa
  results: []
---

# pijarcandra22/t5Indo2Jawa

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 1.6101
- Validation Loss: 1.5534
- Epoch: 130
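
Since the card ships without a usage example, here is a minimal inference sketch. It assumes the standard `transformers` sequence-to-sequence API with the TensorFlow model classes (matching the Keras training setup below) and that the model takes raw Indonesian text; whether a T5 task prefix is expected is not documented.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Load the fine-tuned checkpoint from the Hub with the TF classes,
# since the model was trained with Keras/TensorFlow.
tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/t5Indo2Jawa")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/t5Indo2Jawa")

# Indonesian input sentence; the model is assumed to emit Javanese.
inputs = tokenizer("Selamat pagi, apa kabar?", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```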

## Model description

As its name suggests, t5Indo2Jawa appears to be an Indonesian-to-Javanese (Indo → Jawa) translation model, obtained by fine-tuning the t5-small checkpoint with Keras/TensorFlow. No further details have been provided by the author.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
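
The serialized optimizer config above maps field-for-field onto the Keras-compatible `AdamWeightDecay` class shipped with `transformers`. The sketch below reconstructs it from those values; it is an assumption, not the author's actual training script (the `decay: 0.0` field is the Keras default and is omitted):

```python
from transformers import AdamWeightDecay

# Mirror the serialized hyperparameters from the card.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
    weight_decay_rate=0.01,
)
# model.compile(optimizer=optimizer)  # HF TF models can compute loss internally.
```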

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.5149     | 3.1567          | 0     |
| 3.3816     | 3.0397          | 1     |
| 3.2812     | 2.9518          | 2     |
| 3.1977     | 2.8751          | 3     |
| 3.1223     | 2.8078          | 4     |
| 3.0599     | 2.7507          | 5     |
| 3.0019     | 2.6979          | 6     |
| 2.9517     | 2.6513          | 7     |
| 2.9034     | 2.6121          | 8     |
| 2.8638     | 2.5756          | 9     |
| 2.8232     | 2.5391          | 10    |
| 2.7856     | 2.5089          | 11    |
| 2.7541     | 2.4786          | 12    |
| 2.7219     | 2.4499          | 13    |
| 2.6935     | 2.4256          | 14    |
| 2.6658     | 2.4010          | 15    |
| 2.6389     | 2.3762          | 16    |
| 2.6143     | 2.3550          | 17    |
| 2.5899     | 2.3313          | 18    |
| 2.5665     | 2.3156          | 19    |
| 2.5445     | 2.2939          | 20    |
| 2.5224     | 2.2750          | 21    |
| 2.5022     | 2.2569          | 22    |
| 2.4834     | 2.2410          | 23    |
| 2.4641     | 2.2220          | 24    |
| 2.4443     | 2.2091          | 25    |
| 2.4267     | 2.1948          | 26    |
| 2.4129     | 2.1796          | 27    |
| 2.3937     | 2.1657          | 28    |
| 2.3782     | 2.1523          | 29    |
| 2.3616     | 2.1385          | 30    |
| 2.3471     | 2.1267          | 31    |
| 2.3351     | 2.1110          | 32    |
| 2.3184     | 2.0988          | 33    |
| 2.3047     | 2.0871          | 34    |
| 2.2920     | 2.0768          | 35    |
| 2.2767     | 2.0649          | 36    |
| 2.2651     | 2.0546          | 37    |
| 2.2526     | 2.0445          | 38    |
| 2.2388     | 2.0333          | 39    |
| 2.2264     | 2.0234          | 40    |
| 2.2157     | 2.0165          | 41    |
| 2.2050     | 2.0049          | 42    |
| 2.1906     | 1.9946          | 43    |
| 2.1824     | 1.9845          | 44    |
| 2.1673     | 1.9762          | 45    |
| 2.1559     | 1.9679          | 46    |
| 2.1455     | 1.9608          | 47    |
| 2.1377     | 1.9528          | 48    |
| 2.1279     | 1.9429          | 49    |
| 2.1176     | 1.9356          | 50    |
| 2.1056     | 1.9267          | 51    |
| 2.0979     | 1.9174          | 52    |
| 2.0882     | 1.9087          | 53    |
| 2.0802     | 1.8995          | 54    |
| 2.0668     | 1.8947          | 55    |
| 2.0597     | 1.8880          | 56    |
| 2.0484     | 1.8779          | 57    |
| 2.0405     | 1.8735          | 58    |
| 2.0335     | 1.8676          | 59    |
| 2.0254     | 1.8603          | 60    |
| 2.0147     | 1.8530          | 61    |
| 2.0078     | 1.8459          | 62    |
| 1.9984     | 1.8403          | 63    |
| 1.9902     | 1.8338          | 64    |
| 1.9824     | 1.8264          | 65    |
| 1.9768     | 1.8231          | 66    |
| 1.9679     | 1.8158          | 67    |
| 1.9597     | 1.8104          | 68    |
| 1.9531     | 1.8026          | 69    |
| 1.9460     | 1.7987          | 70    |
| 1.9416     | 1.7929          | 71    |
| 1.9291     | 1.7876          | 72    |
| 1.9245     | 1.7807          | 73    |
| 1.9143     | 1.7788          | 74    |
| 1.9088     | 1.7717          | 75    |
| 1.9006     | 1.7643          | 76    |
| 1.8960     | 1.7587          | 77    |
| 1.8901     | 1.7528          | 78    |
| 1.8808     | 1.7477          | 79    |
| 1.8740     | 1.7436          | 80    |
| 1.8689     | 1.7376          | 81    |
| 1.8628     | 1.7320          | 82    |
| 1.8533     | 1.7312          | 83    |
| 1.8486     | 1.7240          | 84    |
| 1.8428     | 1.7186          | 85    |
| 1.8351     | 1.7141          | 86    |
| 1.8316     | 1.7106          | 87    |
| 1.8234     | 1.7045          | 88    |
| 1.8173     | 1.6976          | 89    |
| 1.8109     | 1.6959          | 90    |
| 1.8059     | 1.6924          | 91    |
| 1.8016     | 1.6860          | 92    |
| 1.7922     | 1.6802          | 93    |
| 1.7887     | 1.6778          | 94    |
| 1.7832     | 1.6716          | 95    |
| 1.7761     | 1.6688          | 96    |
| 1.7724     | 1.6653          | 97    |
| 1.7662     | 1.6582          | 98    |
| 1.7607     | 1.6571          | 99    |
| 1.7549     | 1.6542          | 100   |
| 1.7483     | 1.6497          | 101   |
| 1.7454     | 1.6435          | 102   |
| 1.7400     | 1.6407          | 103   |
| 1.7318     | 1.6363          | 104   |
| 1.7266     | 1.6327          | 105   |
| 1.7234     | 1.6286          | 106   |
| 1.7210     | 1.6267          | 107   |
| 1.7109     | 1.6207          | 108   |
| 1.7079     | 1.6183          | 109   |
| 1.7026     | 1.6162          | 110   |
| 1.6989     | 1.6137          | 111   |
| 1.6925     | 1.6074          | 112   |
| 1.6880     | 1.6051          | 113   |
| 1.6823     | 1.6021          | 114   |
| 1.6780     | 1.5969          | 115   |
| 1.6737     | 1.5960          | 116   |
| 1.6659     | 1.5937          | 117   |
| 1.6603     | 1.5872          | 118   |
| 1.6586     | 1.5870          | 119   |
| 1.6550     | 1.5813          | 120   |
| 1.6506     | 1.5788          | 121   |
| 1.6432     | 1.5771          | 122   |
| 1.6408     | 1.5721          | 123   |
| 1.6377     | 1.5729          | 124   |
| 1.6307     | 1.5693          | 125   |
| 1.6268     | 1.5650          | 126   |
| 1.6227     | 1.5607          | 127   |
| 1.6180     | 1.5618          | 128   |
| 1.6151     | 1.5590          | 129   |
| 1.6101     | 1.5534          | 130   |


### Framework versions

- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0
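
Since TF/Keras checkpoints can be sensitive to version drift, a hedged sketch for verifying the pinned environment before running the snippets above:

```python
import transformers, tensorflow, datasets, tokenizers

# Versions taken verbatim from the card; relax the pins as needed.
assert transformers.__version__ == "4.35.2"
assert tensorflow.__version__ == "2.14.0"
assert datasets.__version__ == "2.15.0"
assert tokenizers.__version__ == "0.15.0"
```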