---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
base_model: facebook/bart-base
model-index:
- name: bart-base-lora
  results: []
---

# bart-base-lora

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.6614
- Accuracy: 0.7909
- Precision: 0.7794
- Recall: 0.7909
- Precision Macro: 0.6664
- Recall Macro: 0.6485
- Macro Fpr: 0.0194
- Weighted Fpr: 0.0186
- Weighted Specificity: 0.9735
- Macro Specificity: 0.9842
- Weighted Sensitivity: 0.7901
- Macro Sensitivity: 0.6485
- F1 Micro: 0.7901
- F1 Macro: 0.6250
- F1 Weighted: 0.7804
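
The card reports both macro averages (each class counted equally) and weighted averages (each class weighted by its support) of several per-class rates. As a reference, here is a hedged sketch of how such rates are commonly derived from a multi-class confusion matrix with scikit-learn; this is not the script used to produce the numbers above, and the function name is illustrative.

```python
import numpy as np
from sklearn.metrics import confusion_matrix, f1_score

def rate_metrics(y_true, y_pred):
    """Illustrative per-class rates; not the original evaluation code."""
    cm = confusion_matrix(y_true, y_pred)
    tp = np.diag(cm)                       # true positives per class
    fp = cm.sum(axis=0) - tp               # false positives per class
    fn = cm.sum(axis=1) - tp               # false negatives per class
    tn = cm.sum() - (tp + fp + fn)         # true negatives per class

    sensitivity = tp / (tp + fn)           # per-class recall
    specificity = tn / (tn + fp)           # per-class true-negative rate
    fpr = fp / (fp + tn)                   # per-class false-positive rate
    weights = cm.sum(axis=1) / cm.sum()    # class support, for weighted averages

    return {
        "macro_sensitivity": sensitivity.mean(),
        "weighted_sensitivity": (weights * sensitivity).sum(),
        "macro_specificity": specificity.mean(),
        "weighted_specificity": (weights * specificity).sum(),
        "macro_fpr": fpr.mean(),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "f1_weighted": f1_score(y_true, y_pred, average="weighted"),
    }
```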

## Model description

More information needed. Judging by the model name, this appears to be a LoRA (low-rank adaptation) fine-tune of `facebook/bart-base`, and the reported accuracy/precision/recall metrics suggest a multi-class sequence-classification task; no further details were provided by the authors.

## Intended uses & limitations

More information needed
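
In the absence of documented usage, the sketch below shows how a LoRA checkpoint like this is typically loaded for inference with PEFT. It rests on assumptions: the repo id is a placeholder, `NUM_LABELS` must be set to the actual task's label count, and the classification task itself is inferred from the reported metrics rather than stated by the authors.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSequenceClassification, AutoTokenizer

NUM_LABELS = 2  # placeholder: set to the label count of the actual task

base = AutoModelForSequenceClassification.from_pretrained(
    "facebook/bart-base", num_labels=NUM_LABELS
)
model = PeftModel.from_pretrained(base, "<user>/bart-base-lora")  # placeholder repo id
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")

inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted class index
```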

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
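
For readers who want to reproduce the run, the list above maps onto `TrainingArguments` roughly as sketched below. The `output_dir` is a placeholder, and any option not listed was presumably left at its Trainer default; the stated optimizer (Adam with betas=(0.9, 0.999), epsilon=1e-08) matches the Trainer's default AdamW configuration, so it needs no explicit argument.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bart-base-lora",      # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=4,    # 8 x 4 = total train batch size of 32
    num_train_epochs=15,
    lr_scheduler_type="linear",
    seed=42,
)
```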

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro Fpr | Weighted Fpr | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 160  | 1.3205          | 0.6112   | 0.5322    | 0.6112 | 0.2887          | 0.3024       | 0.0464    | 0.0435       | 0.9266               | 0.9692            | 0.6112               | 0.3024            | 0.6112   | 0.2871   | 0.5575      |
| No log        | 2.0   | 321  | 0.8875          | 0.6995   | 0.6728    | 0.6995 | 0.3822          | 0.4254       | 0.0306    | 0.0298       | 0.9609               | 0.9774            | 0.6995               | 0.4254            | 0.6995   | 0.3948   | 0.6808      |
| No log        | 3.0   | 482  | 0.8427          | 0.7064   | 0.6952    | 0.7064 | 0.4131          | 0.4442       | 0.0295    | 0.0288       | 0.9641               | 0.9780            | 0.7064               | 0.4442            | 0.7064   | 0.3969   | 0.6752      |
| 1.2895        | 4.0   | 643  | 0.7719          | 0.7273   | 0.7132    | 0.7273 | 0.4198          | 0.4598       | 0.0264    | 0.0261       | 0.9690               | 0.9798            | 0.7273               | 0.4598            | 0.7273   | 0.4284   | 0.7167      |
| 1.2895        | 5.0   | 803  | 0.7388          | 0.7506   | 0.7400    | 0.7506 | 0.5733          | 0.5165       | 0.0239    | 0.0232       | 0.9697               | 0.9814            | 0.7506               | 0.5165            | 0.7506   | 0.5072   | 0.7368      |
| 1.2895        | 6.0   | 964  | 0.7526          | 0.7444   | 0.7337    | 0.7444 | 0.5703          | 0.5230       | 0.0247    | 0.0239       | 0.9691               | 0.9809            | 0.7444               | 0.5230            | 0.7444   | 0.5088   | 0.7268      |
| 0.7332        | 7.0   | 1125 | 0.7082          | 0.7552   | 0.7436    | 0.7552 | 0.5665          | 0.5728       | 0.0233    | 0.0226       | 0.9712               | 0.9818            | 0.7552               | 0.5728            | 0.7552   | 0.5609   | 0.7461      |
| 0.7332        | 8.0   | 1286 | 0.7161          | 0.7583   | 0.7489    | 0.7583 | 0.5641          | 0.5975       | 0.0228    | 0.0223       | 0.9721               | 0.9820            | 0.7583               | 0.5975            | 0.7583   | 0.5756   | 0.7503      |
| 0.7332        | 9.0   | 1446 | 0.6831          | 0.7777   | 0.7587    | 0.7777 | 0.5781          | 0.6069       | 0.0208    | 0.0200       | 0.9715               | 0.9833            | 0.7777               | 0.6069            | 0.7777   | 0.5875   | 0.7653      |
| 0.6167        | 10.0  | 1607 | 0.6683          | 0.7862   | 0.7714    | 0.7862 | 0.5917          | 0.6174       | 0.0198    | 0.0191       | 0.9728               | 0.9839            | 0.7862               | 0.6174            | 0.7862   | 0.5987   | 0.7754      |
| 0.6167        | 11.0  | 1768 | 0.6885          | 0.7761   | 0.7628    | 0.7761 | 0.5817          | 0.6220       | 0.0210    | 0.0202       | 0.9723               | 0.9832            | 0.7761               | 0.6220            | 0.7761   | 0.5946   | 0.7642      |
| 0.6167        | 12.0  | 1929 | 0.6830          | 0.7870   | 0.7826    | 0.7870 | 0.6627          | 0.6464       | 0.0197    | 0.0190       | 0.9734               | 0.9840            | 0.7870               | 0.6464            | 0.7870   | 0.6214   | 0.7764      |
| 0.5314        | 13.0  | 2089 | 0.6605          | 0.7916   | 0.7770    | 0.7916 | 0.5965          | 0.6358       | 0.0192    | 0.0185       | 0.9741               | 0.9844            | 0.7916               | 0.6358            | 0.7916   | 0.6111   | 0.7818      |
| 0.5314        | 14.0  | 2250 | 0.6614          | 0.7909   | 0.7794    | 0.7909 | 0.6368          | 0.6478       | 0.0193    | 0.0185       | 0.9729               | 0.9842            | 0.7909               | 0.6478            | 0.7909   | 0.6261   | 0.7803      |
| 0.5314        | 14.93 | 2400 | 0.6647          | 0.7901   | 0.7852    | 0.7901 | 0.6664          | 0.6485       | 0.0194    | 0.0186       | 0.9735               | 0.9842            | 0.7901               | 0.6485            | 0.7901   | 0.6250   | 0.7804      |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1