---
license: apache-2.0
tags:
- generated_from_trainer
base_model: TinyLlama/TinyLlama-1.1B-Chat-v1.0
metrics:
- accuracy
- precision
- recall
model-index:
- name: tiny-llama-lora-new
  results: []
---

# tiny-llama-lora-new

This model is a fine-tuned version of [TinyLlama/TinyLlama-1.1B-Chat-v1.0](https://huggingface.co/TinyLlama/TinyLlama-1.1B-Chat-v1.0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2252
- Accuracy: 0.8203
- Precision: 0.8184
- Recall: 0.8203
- Precision Macro: 0.7732
- Recall Macro: 0.7380
- Macro FPR: 0.0162
- Weighted FPR: 0.0154
- Weighted Specificity: 0.9743
- Macro Specificity: 0.9863
- Weighted Sensitivity: 0.8203
- Macro Sensitivity: 0.7380
- F1 Micro: 0.8203
- F1 Macro: 0.7435
- F1 Weighted: 0.8173

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
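For reference, the hyperparameters above map onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch, not the card's actual training script (which is not published); the `output_dir` value is an assumption taken from the card title.

```python
# Sketch only: the listed hyperparameters expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tiny-llama-lora-new",  # assumed, matching the card title
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,     # 8 * 4 = 32 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
)
```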
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | Precision Macro | Recall Macro | Macro FPR | Weighted FPR | Weighted Specificity | Macro Specificity | Weighted Sensitivity | Macro Sensitivity | F1 Micro | F1 Macro | F1 Weighted |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:---------------:|:------------:|:---------:|:------------:|:--------------------:|:-----------------:|:--------------------:|:-----------------:|:--------:|:--------:|:-----------:|
| No log        | 1.0   | 160  | 0.6615          | 0.8002   | 0.8040    | 0.8002 | 0.7266          | 0.6678       | 0.0182    | 0.0175       | 0.9726               | 0.9848            | 0.8002               | 0.6678            | 0.8002   | 0.6790   | 0.7959      |
| No log        | 2.0   | 321  | 0.6996          | 0.8064   | 0.8110    | 0.8064 | 0.7448          | 0.7207       | 0.0177    | 0.0169       | 0.9737               | 0.9853            | 0.8064               | 0.7207            | 0.8064   | 0.7235   | 0.8039      |
| No log        | 3.0   | 482  | 0.8202          | 0.8125   | 0.8119    | 0.8125 | 0.7577          | 0.7080       | 0.0171    | 0.0162       | 0.9711               | 0.9856            | 0.8125               | 0.7080            | 0.8125   | 0.7180   | 0.8085      |
| 0.2932        | 4.0   | 643  | 0.9493          | 0.8141   | 0.8204    | 0.8141 | 0.7593          | 0.7327       | 0.0166    | 0.0160       | 0.9744               | 0.9859            | 0.8141               | 0.7327            | 0.8141   | 0.7415   | 0.8154      |
| 0.2932        | 5.0   | 803  | 1.0610          | 0.8110   | 0.8110    | 0.8110 | 0.7596          | 0.7427       | 0.0172    | 0.0164       | 0.9738               | 0.9857            | 0.8110               | 0.7427            | 0.8110   | 0.7413   | 0.8087      |
| 0.2932        | 6.0   | 964  | 1.1362          | 0.8149   | 0.8160    | 0.8149 | 0.7731          | 0.7380       | 0.0167    | 0.0160       | 0.9741               | 0.9859            | 0.8149               | 0.7380            | 0.8149   | 0.7408   | 0.8128      |
| 0.0107        | 7.0   | 1125 | 1.1713          | 0.8102   | 0.8123    | 0.8102 | 0.7734          | 0.7310       | 0.0171    | 0.0165       | 0.9736               | 0.9856            | 0.8102               | 0.7310            | 0.8102   | 0.7343   | 0.8085      |
| 0.0107        | 8.0   | 1286 | 1.1786          | 0.8156   | 0.8141    | 0.8156 | 0.7656          | 0.7349       | 0.0166    | 0.0159       | 0.9740               | 0.9860            | 0.8156               | 0.7349            | 0.8156   | 0.7374   | 0.8128      |
| 0.0107        | 9.0   | 1446 | 1.1960          | 0.8187   | 0.8170    | 0.8187 | 0.7693          | 0.7368       | 0.0163    | 0.0156       | 0.9743               | 0.9862            | 0.8187               | 0.7368            | 0.8187   | 0.7400   | 0.8157      |
| 0.0016        | 10.0  | 1607 | 1.2049          | 0.8156   | 0.8150    | 0.8156 | 0.7659          | 0.7353       | 0.0166    | 0.0159       | 0.9741               | 0.9860            | 0.8156               | 0.7353            | 0.8156   | 0.7376   | 0.8131      |
| 0.0016        | 11.0  | 1768 | 1.2137          | 0.8156   | 0.8147    | 0.8156 | 0.7661          | 0.7353       | 0.0166    | 0.0159       | 0.9741               | 0.9860            | 0.8156               | 0.7353            | 0.8156   | 0.7377   | 0.8130      |
| 0.0016        | 12.0  | 1929 | 1.2158          | 0.8156   | 0.8145    | 0.8156 | 0.7664          | 0.7353       | 0.0166    | 0.0159       | 0.9739               | 0.9860            | 0.8156               | 0.7353            | 0.8156   | 0.7379   | 0.8129      |
| 0.0011        | 13.0  | 2089 | 1.2202          | 0.8187   | 0.8169    | 0.8187 | 0.7720          | 0.7372       | 0.0163    | 0.0156       | 0.9741               | 0.9862            | 0.8187               | 0.7372            | 0.8187   | 0.7425   | 0.8158      |
| 0.0011        | 14.0  | 2250 | 1.2229          | 0.8187   | 0.8169    | 0.8187 | 0.7720          | 0.7372       | 0.0163    | 0.0156       | 0.9741               | 0.9862            | 0.8187               | 0.7372            | 0.8187   | 0.7425   | 0.8158      |
| 0.0011        | 14.93 | 2400 | 1.2252          | 0.8203   | 0.8184    | 0.8203 | 0.7732          | 0.7380       | 0.0162    | 0.0154       | 0.9743               | 0.9863            | 0.8203               | 0.7380            | 0.8203   | 0.7435   | 0.8173      |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.19.0
- Tokenizers 0.15.1
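### Metric definitions

The specificity and false-positive-rate columns above are not standard `Trainer` outputs. As an illustration of how such per-class rates are typically derived from a confusion matrix, here is a minimal sketch assuming integer class labels and scikit-learn; the helper name is hypothetical and this is not the card's actual evaluation code.

```python
# Illustrative sketch (not this card's evaluation code): derives the table's
# macro/weighted metrics from label predictions.
import numpy as np
from sklearn.metrics import (accuracy_score, confusion_matrix,
                             precision_recall_fscore_support)

def compute_eval_metrics(y_true, y_pred):  # hypothetical helper name
    labels = np.unique(np.concatenate([y_true, y_pred]))
    cm = confusion_matrix(y_true, y_pred, labels=labels)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp             # true class c, predicted otherwise
    fp = cm.sum(axis=0) - tp             # predicted class c, actually otherwise
    tn = cm.sum() - tp - fn - fp
    fpr = fp / (fp + tn)                 # per-class false positive rate
    specificity = tn / (tn + fp)         # per-class true negative rate
    weights = cm.sum(axis=1) / cm.sum()  # class support, for weighted averages

    p_m, r_m, f1_m, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=labels, average="macro", zero_division=0)
    p_w, r_w, f1_w, _ = precision_recall_fscore_support(
        y_true, y_pred, labels=labels, average="weighted", zero_division=0)

    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "precision_macro": p_m, "recall_macro": r_m, "f1_macro": f1_m,
        "precision_weighted": p_w, "recall_weighted": r_w, "f1_weighted": f1_w,
        "macro_fpr": fpr.mean(),
        "weighted_fpr": float(weights @ fpr),
        "macro_specificity": specificity.mean(),
        "weighted_specificity": float(weights @ specificity),
    }
```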