pszemraj committed
Commit f9ab922 (1 parent: 28ed651)

update model card README.md

Files changed (1): README.md (+70, -0)
README.md ADDED
---
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: tglobal-large-booksum-WIP5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tglobal-large-booksum-WIP5

This model is a fine-tuned version of [pszemraj/tglobal-large-booksum-WIP4](https://huggingface.co/pszemraj/tglobal-large-booksum-WIP4) on an unspecified dataset.
It achieves the following results on the evaluation set (a sketch of how such ROUGE scores are computed follows the list):
- Loss: 4.9519
- Rouge1: 21.8058
- Rouge2: 2.9343
- Rougel: 10.3717
- Rougelsum: 20.1537
- Gen Len: 106.055

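For reference, metrics in this style can be reproduced with the 🤗 `evaluate` library. The sketch below assumes `predictions` and `references` are parallel lists of summary strings; depending on the `evaluate` version, scores are fractions in [0, 1], while the card reports them scaled by 100.

```python
# Minimal ROUGE sketch using the `evaluate` library; the example strings
# below are placeholders, not data from this card.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["a generated summary ..."]  # model outputs (placeholder)
references = ["a reference summary ..."]   # gold summaries (placeholder)

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum
```
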
## Model description

More information needed

## Intended uses & limitations

More information needed

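As a starting point, the snippet below shows one way to run inference with the `transformers` summarization pipeline. It assumes this checkpoint is published under the repo id `pszemraj/tglobal-large-booksum-WIP5` (inferred from the model name above, not confirmed by this card) and that it is a seq2seq summarization model.

```python
# Minimal inference sketch; the repo id is assumed from the model name,
# not confirmed by this card.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="pszemraj/tglobal-large-booksum-WIP5",  # assumed repo id
)

long_text = "..."  # replace with the document to summarize

result = summarizer(
    long_text,
    max_length=256,          # cap on generated tokens; tune as needed
    no_repeat_ngram_size=3,  # reduce verbatim repetition
)
print(result[0]["summary_text"])
```
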
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstruction sketch follows the list):
- learning_rate: 0.0004
- train_batch_size: 1
- eval_batch_size: 1
- seed: 31060
- distributed_type: multi-GPU
- num_devices: 4
- gradient_accumulation_steps: 32
- total_train_batch_size: 128
- total_eval_batch_size: 4
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 3.0

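The original training script is not included in this card, so the following is a hedged reconstruction of the listed hyperparameters as `Seq2SeqTrainingArguments`; `output_dir` is a placeholder, and the multi-GPU setup (4 devices) is handled by the launcher rather than these arguments.

```python
# Hedged reconstruction of the hyperparameters listed above; argument
# names beyond those in the list (e.g. output_dir) are placeholders.
from transformers import Seq2SeqTrainingArguments

args = Seq2SeqTrainingArguments(
    output_dir="./tglobal-large-booksum-WIP5",  # placeholder
    learning_rate=4e-4,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=32,
    seed=31060,
    lr_scheduler_type="cosine",
    num_train_epochs=3.0,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# Effective (total) train batch size with 4 GPUs:
# 1 per device * 4 devices * 32 accumulation steps = 128
```
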
### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| 5.0389        | 0.99  | 37   | 5.1884          | 29.995  | 4.4045 | 12.8837 | 27.557    | 219.03  |
| 4.8986        | 1.0   | 75   | 5.1286          | 26.921  | 3.7193 | 11.3605 | 25.3492   | 276.005 |
| 4.5928        | 2.0   | 150  | 4.9900          | 26.6667 | 3.7342 | 11.8223 | 24.7087   | 178.775 |
| 4.6159        | 3.0   | 225  | 4.9519          | 21.8058 | 2.9343 | 10.3717 | 20.1537   | 106.055 |


### Framework versions

- Transformers 4.25.0.dev0
- Pytorch 1.13.0+cu117
- Datasets 2.6.1
- Tokenizers 0.13.1