---
license: apache-2.0
language:
- en
metrics:
- accuracy
library_name: transformers
pipeline_tag: text-generation
datasets:
- Barishni-blinchik/uwbruh
tags:
- kawaii
- cringe
---
***Some cringe...* Oh well, hello!**

I present to you GPT-2, but with a bit of kawaii.

## Chat template
```
<|USER|> Hello <|ASSISTANT|>
```
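To try the template from Python, here is a minimal usage sketch with the `transformers` text-generation API. The repo id and sampling settings below are placeholders, not values stated on this card; swap in this model's actual Hub id.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this model's actual repo id on the Hub.
model_id = "Barishni-blinchik/kawaii-gpt2"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build the prompt in the chat-template format shown above.
prompt = "<|USER|> Hello <|ASSISTANT|>"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short reply; sampling settings are illustrative only.
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    do_sample=True,
    top_p=0.95,
    temperature=0.8,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=False))
```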
---
# Training Results
The following metrics come from the latest training run of this model (a sketch of how such numbers are produced follows the list):
## Overview
- **Global Step:** 615
- **Training Loss:** 0.1303
## Detailed Metrics
- **Training Runtime:** 413.1481 seconds
- **Training Samples per Second:** 5.947
- **Training Steps per Second:** 1.489
- **Total Floating-Point Operations (`total_flos`):** 641,994,522,624,000 (≈ 6.42 × 10¹⁴)
- **Training Loss (full precision):** 0.13032278840134784
- **Epoch:** 3.0
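
These field names match the `TrainOutput` and metrics dictionary returned by the Hugging Face `Trainer` after `trainer.train()`. Below is a minimal, hedged sketch of how such numbers are produced; the tiny corpus, batch size, and output directory are illustrative placeholders, not the settings actually used to train this model (the real data is the `Barishni-blinchik/uwbruh` dataset listed in the metadata).

```python
from datasets import Dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

# Tiny placeholder corpus in the chat-template format shown above.
texts = ["<|USER|> Hello <|ASSISTANT|> Hello! uwu"]

def tokenize(batch):
    out = tokenizer(batch["text"], truncation=True, max_length=64)
    out["labels"] = [ids.copy() for ids in out["input_ids"]]
    return out

train_dataset = (
    Dataset.from_dict({"text": texts})
    .map(tokenize, batched=True, remove_columns=["text"])
)

args = TrainingArguments(
    output_dir="kawaii-gpt2",       # placeholder output directory
    num_train_epochs=3,             # matches "Epoch: 3.0" above
    per_device_train_batch_size=4,  # illustrative, not the original setting
    report_to="none",
)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
result = trainer.train()

# result.global_step and result.training_loss give the "Overview" numbers;
# result.metrics holds train_runtime, train_samples_per_second,
# train_steps_per_second, total_flos, train_loss and epoch.
print(result.global_step, result.training_loss)
print(result.metrics)
```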
---