---
license: mit
---
# This Model
This is a partially continued-pretrained Llama 3.1 8B LLM (based on unsloth/Meta-Llama-3.1-8B). Training was done on [200k articles from Arabic Wikipedia 2023](akhooli/arwiki_128).
This is a proof-of-concept demo and should not be used in production.
The model was then instruction fine-tuned for classical Arabic poetry generation (toy model: https://huggingface.co/akhooli/llama31ft).
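
## Usage

A minimal usage sketch with the Hugging Face `transformers` library. The model id below is the base checkpoint named above, used as a placeholder; swap in this repo's id to load the continued-pretrained weights. The `build_prompt` helper and its Arabic template are illustrative assumptions, not part of the training setup.

```python
MODEL_ID = "unsloth/Meta-Llama-3.1-8B"  # placeholder: replace with this repo's id


def build_prompt(topic: str) -> str:
    """Build a simple Arabic completion prompt (illustrative only)."""
    # "Write a short article about <topic>:"
    return f"اكتب مقالة قصيرة عن {topic}:"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Generate a continuation from the checkpoint.

    Imports are local so the sketch can be read/tested without the
    heavy dependencies installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate(build_prompt("الشعر العربي")))  # "Arabic poetry"
```

Since this is a base (non-instruct) checkpoint, expect free-form continuation rather than instruction following; the fine-tuned toy model linked above is the one tuned for poetry prompts.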