This model is an instruction-tuned version of EleutherAI/polyglot-ko-1.3b.
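The card does not include a usage snippet, so here is a minimal sketch of loading the model with the `transformers` library. The prompt template is an assumption (the card does not specify one), and generation settings are illustrative only.

```python
# Minimal usage sketch (not from the model card). The plain-text prompt
# format below is an assumption; adjust it to the model's actual template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DILAB-HYU/koquality-polyglot-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model.to("cuda" if torch.cuda.is_available() else "cpu")

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```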
## Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):
- learning_rate: 5e-5
- train_batch_size: 1
- seed: 42
- distributed_type: multi-GPU (A30 24 GB) with CPU offloading (384 GB)
- num_devices: 2
- gradient_accumulation_steps: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2.0
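For reference, the listed hyperparameters can be expressed as a `transformers` `TrainingArguments` sketch. The original training script is not shown, so `output_dir` and the DeepSpeed config path are placeholders.

```python
# Hedged sketch mapping the card's hyperparameters onto TrainingArguments;
# output_dir and the deepspeed config path are hypothetical placeholders.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="output",             # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    seed=42,
    gradient_accumulation_steps=32,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2.0,
    deepspeed="ds_config.json",      # hypothetical path; card lists DeepSpeed 0.9.5
)
```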
## Framework versions
- Transformers 4.34.1
- PyTorch 2.0.1+cu117
- Datasets 2.11.0
- DeepSpeed 0.9.5