
Full Parameter Finetuning: Qwen2 0.5B, 8192 context length, on 12.86B tokens of Malaysian text

Continued pretraining of https://huggingface.co/Qwen/Qwen2-0.5B on a 12.86B-token Malaysian dataset using an 8192 context length.
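
Since this is a base model (continued pretraining, not instruction tuning), it is used for plain text completion. Below is a minimal usage sketch assuming the standard Hugging Face transformers causal LM interface; the prompt and generation parameters are illustrative only.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mesolitica/Qwen2-0.5B-8192-fpf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Malay prompt, since the model was pretrained on Malaysian text.
prompt = "Kuala Lumpur ialah"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```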

WandB logs at https://wandb.ai/huseinzol05/Qwen2-0.5B-fpf/workspace?nw=nwuserhuseinzol05

Model size: 494M params (BF16, safetensors)
