Yeji-Seong committed on
Commit 275d241
1 Parent(s): 61ccf36

End of training

Files changed (1):
  1. README.md +63 -0

README.md ADDED
@@ -0,0 +1,63 @@
---
license: apache-2.0
library_name: peft
tags:
- generated_from_trainer
metrics:
- accuracy
base_model: distilbert-base-uncased
model-index:
- name: distilbert-base-uncased-textclassification_ia3
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# distilbert-base-uncased-textclassification_ia3

This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2471
- Accuracy: 0.8983

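The card does not include a usage snippet, so here is a minimal inference sketch. The Hub repo id and the two-label head are assumptions, not read from this card; adjust both if the actual adapter path or label count differs.

```python
# Minimal inference sketch: load the IA3 adapter on top of the distilbert-base-uncased backbone.
# The repo id and num_labels below are assumptions, not read from this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import PeftModel

base = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)
model = PeftModel.from_pretrained(base, "Yeji-Seong/distilbert-base-uncased-textclassification_ia3")
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

inputs = tokenizer("This movie was surprisingly good!", return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)
print(probs)
```
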
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2

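As a rough guide to how these settings map onto code, the sketch below wires them into `TrainingArguments` with an IA3 adapter (PEFT 0.7.1 / Transformers 4.36.2). The dataset (IMDb is used as a stand-in) and the IA3 target modules are illustrative assumptions; the card states neither.

```python
# Training-setup sketch matching the hyperparameters above.
# The dataset and IA3 target modules are assumptions, not read from this card.
import numpy as np
from datasets import load_dataset
from peft import IA3Config, get_peft_model
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    DataCollatorWithPadding,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
raw = load_dataset("imdb")  # stand-in dataset; the real training data is not named in the card
tokenized = raw.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

model = AutoModelForSequenceClassification.from_pretrained("distilbert-base-uncased", num_labels=2)
peft_config = IA3Config(
    task_type="SEQ_CLS",
    # DistilBERT linear-layer names, assumed here rather than taken from the adapter config
    target_modules=["q_lin", "k_lin", "v_lin", "out_lin", "lin1", "lin2"],
    feedforward_modules=["lin1", "lin2"],
)
model = get_peft_model(model, peft_config)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-textclassification_ia3",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=2,
    seed=42,
    lr_scheduler_type="linear",  # the Adam betas/epsilon listed above are the optimizer defaults
    evaluation_strategy="epoch",
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    tokenizer=tokenizer,
    data_collator=DataCollatorWithPadding(tokenizer),
    compute_metrics=compute_metrics,
)
trainer.train()
```
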
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2509        | 1.0   | 1563 | 0.2474          | 0.8985   |
| 0.2538        | 2.0   | 3126 | 0.2471          | 0.8983   |

### Framework versions

- PEFT 0.7.1
- Transformers 4.36.2
- PyTorch 2.0.0+cu117
- Datasets 2.16.1
- Tokenizers 0.15.0