---
license: mit
tags:
- generated_from_trainer
model-index:
- name: ec-biogpt-noised-pubmed-v3
  results: []
---

# ec-biogpt-noised-pubmed-v3

This model is a fine-tuned version of [microsoft/biogpt](https://huggingface.co/microsoft/biogpt) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7552 (final evaluation; the best validation loss during training is 1.6583 at step 13,500, after which the model begins to overfit; see the training results table below)
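
A minimal inference sketch, assuming the checkpoint is loadable by its hub path (shown here as the bare placeholder `ec-biogpt-noised-pubmed-v3`; substitute the actual repo id) and noting that BioGPT's tokenizer additionally requires the `sacremoses` package:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder id: substitute the actual hub path of this checkpoint.
checkpoint = "ec-biogpt-noised-pubmed-v3"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)
model.eval()

prompt = "COVID-19 is"
inputs = tokenizer(prompt, return_tensors="pt")

# BioGPT is a causal language model, so generate() applies directly.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,
        do_sample=True,
        top_p=0.9,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```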

## Model description

This checkpoint fine-tunes [microsoft/biogpt](https://huggingface.co/microsoft/biogpt), a GPT-style causal language model pretrained on PubMed abstracts. The checkpoint name suggests training on a noised PubMed corpus, but the training data is otherwise undocumented.

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch reproducing them appears after the list):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 5
- mixed_precision_training: Native AMP
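
The list maps onto `transformers.TrainingArguments` roughly as follows. This is a reconstruction from the card, not the original training script: `output_dir` is a placeholder, the eval cadence is inferred from the 500-step intervals in the results table, and a single device is assumed since no total batch size is reported.

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameter list above; not the original script.
training_args = TrainingArguments(
    output_dir="ec-biogpt-noised-pubmed-v3",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,  # assumes a single device
    per_device_eval_batch_size=16,
    seed=42,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 are the library defaults:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=10,
    num_train_epochs=5,
    fp16=True,  # "Native AMP" mixed precision
    evaluation_strategy="steps",  # inferred: the table below evaluates every 500 steps
    eval_steps=500,
    logging_steps=500,
)
```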

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| 1.9981        | 0.07  | 500   | 1.8163          |
| 1.7501        | 0.14  | 1000  | 1.7809          |
| 2.0623        | 0.22  | 1500  | 1.7638          |
| 1.8094        | 0.29  | 2000  | 1.7458          |
| 1.8711        | 0.36  | 2500  | 1.7326          |
| 1.6588        | 0.43  | 3000  | 1.7244          |
| 1.5469        | 0.5   | 3500  | 1.7153          |
| 1.6981        | 0.57  | 4000  | 1.7084          |
| 1.6728        | 0.65  | 4500  | 1.7025          |
| 1.8203        | 0.72  | 5000  | 1.6973          |
| 1.8318        | 0.79  | 5500  | 1.6924          |
| 1.6916        | 0.86  | 6000  | 1.6906          |
| 1.6369        | 0.93  | 6500  | 1.6816          |
| 1.4371        | 1.01  | 7000  | 1.6838          |
| 1.381         | 1.08  | 7500  | 1.6829          |
| 1.6214        | 1.15  | 8000  | 1.6846          |
| 1.6218        | 1.22  | 8500  | 1.6790          |
| 1.6278        | 1.29  | 9000  | 1.6788          |
| 1.4046        | 1.36  | 9500  | 1.6774          |
| 1.4866        | 1.44  | 10000 | 1.6728          |
| 1.4712        | 1.51  | 10500 | 1.6716          |
| 1.5896        | 1.58  | 11000 | 1.6702          |
| 1.4818        | 1.65  | 11500 | 1.6681          |
| 1.4261        | 1.72  | 12000 | 1.6638          |
| 1.5318        | 1.79  | 12500 | 1.6624          |
| 1.4814        | 1.87  | 13000 | 1.6620          |
| 1.5131        | 1.94  | 13500 | 1.6583          |
| 1.3971        | 2.01  | 14000 | 1.6806          |
| 1.4146        | 2.08  | 14500 | 1.6842          |
| 1.5739        | 2.15  | 15000 | 1.6888          |
| 1.312         | 2.23  | 15500 | 1.6857          |
| 1.4992        | 2.3   | 16000 | 1.6876          |
| 1.2725        | 2.37  | 16500 | 1.6845          |
| 1.3904        | 2.44  | 17000 | 1.6840          |
| 1.4569        | 2.51  | 17500 | 1.6855          |
| 1.4358        | 2.58  | 18000 | 1.6811          |
| 1.4747        | 2.66  | 18500 | 1.6814          |
| 1.3272        | 2.73  | 19000 | 1.6818          |
| 1.3743        | 2.8   | 19500 | 1.6756          |
| 1.3953        | 2.87  | 20000 | 1.6759          |
| 1.4173        | 2.94  | 20500 | 1.6748          |
| 1.3998        | 3.02  | 21000 | 1.7133          |
| 1.3396        | 3.09  | 21500 | 1.7205          |
| 1.1953        | 3.16  | 22000 | 1.7218          |
| 1.2047        | 3.23  | 22500 | 1.7223          |
| 1.0788        | 3.3   | 23000 | 1.7214          |
| 1.3048        | 3.37  | 23500 | 1.7230          |
| 1.3271        | 3.45  | 24000 | 1.7195          |
| 1.4236        | 3.52  | 24500 | 1.7208          |
| 1.1851        | 3.59  | 25000 | 1.7209          |
| 1.285         | 3.66  | 25500 | 1.7207          |
| 1.3013        | 3.73  | 26000 | 1.7174          |
| 1.2734        | 3.81  | 26500 | 1.7182          |
| 1.3496        | 3.88  | 27000 | 1.7168          |
| 1.3628        | 3.95  | 27500 | 1.7134          |
| 1.0063        | 4.02  | 28000 | 1.7507          |
| 1.1155        | 4.09  | 28500 | 1.7557          |
| 1.1886        | 4.16  | 29000 | 1.7571          |
| 1.1304        | 4.24  | 29500 | 1.7575          |
| 1.0328        | 4.31  | 30000 | 1.7563          |
| 1.2631        | 4.38  | 30500 | 1.7584          |
| 1.2212        | 4.45  | 31000 | 1.7564          |
| 1.1825        | 4.52  | 31500 | 1.7583          |
| 1.4374        | 4.6   | 32000 | 1.7562          |
| 1.1568        | 4.67  | 32500 | 1.7554          |
| 1.3035        | 4.74  | 33000 | 1.7565          |
| 1.27          | 4.81  | 33500 | 1.7557          |
| 1.2518        | 4.88  | 34000 | 1.7560          |
| 1.0965        | 4.95  | 34500 | 1.7552          |


### Framework versions

- Transformers 4.27.4
- Pytorch 2.0.0+cu117
- Datasets 2.11.0
- Tokenizers 0.13.3
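
To pin a matching environment (assuming the `+cu117` PyTorch build, which ships from the PyTorch CUDA 11.7 wheel index rather than PyPI; `sacremoses` is needed by the BioGPT tokenizer):

```bash
pip install transformers==4.27.4 datasets==2.11.0 tokenizers==0.13.3 sacremoses
pip install torch==2.0.0 --index-url https://download.pytorch.org/whl/cu117
```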