# ITT-AF/ITT-AF-PLM-2.2B_v0.4
This model was pretrained on a custom dataset (110 GB).
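A minimal loading sketch with the Transformers library is shown below; only the repository id from this card is taken as given, everything else uses plain defaults.

```python
# Minimal loading sketch; only the repository id comes from this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ITT-AF/ITT-AF-PLM-2.2B_v0.4"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
```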
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
The data used to train the model was collected from various sources, mostly from the Web. As such, it contains offensive, harmful, and biased content, and we expect the model to exhibit biases inherited from this training data.
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (see the configuration sketch after the list):
- learning_rate: 2e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 96
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1.0
- mixed_precision_training: Native AMP
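For reference, here is a hedged sketch of how these values map onto `transformers.TrainingArguments`; the output directory is a placeholder, and `fp16=True` is an assumption standing in for "Native AMP" on a CUDA device.

```python
# Configuration sketch mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./itt-af-plm-2.2b",   # placeholder path, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=1.0,
    fp16=True,                        # assumption: "Native AMP" via fp16 on CUDA
)
```

On a single device, `per_device_train_batch_size=24` with `gradient_accumulation_steps=4` yields the total train batch size of 96 listed above.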
### Training results

Sample generation from the pretrained model (English translations added in parentheses):

text = "한국의 수도는" ("The capital of Korea is")

gen_text = "한국의 수도는 서울이다. 그러나 서울이라는 도시는 그 자체가 하나의 거대한 도시다. 서울의 중심은 광화문광장이다." ("The capital of Korea is Seoul. However, the city of Seoul is itself one enormous city. The center of Seoul is Gwanghwamun Square.")
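The sample above could be reproduced along the following lines; the sampling settings (`do_sample`, `max_new_tokens`) are illustrative assumptions, not the settings actually used to produce the output shown.

```python
# Illustrative generation sketch; sampling settings are assumptions,
# not values taken from this card.
from transformers import pipeline

generator = pipeline("text-generation", model="ITT-AF/ITT-AF-PLM-2.2B_v0.4")
result = generator("한국의 수도는", max_new_tokens=64, do_sample=True)
print(result[0]["generated_text"])
```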
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu121
- Datasets 2.0.0
- Tokenizers 0.15.0