
gpt2-small-amharic-128-v3

This is a smaller version of the gpt2 decoder-only transformer model, pretrained from scratch for 1.5 days on 290 million tokens of Amharic text.

  • It has 33.7 million parameters.
  • The context size of this model is 128 tokens.
  • Its tokenizer is a GPT-2-style tokenizer trained from scratch on the same dataset, with a vocabulary size of 16,384.
  • This is a base model and hasn't undergone any supervised fine-tuning yet.

It achieves the following results on the evaluation set:

  • Loss: 3.99
  • Perplexity: 54.17

Demo

You can use the following demo to generate text with this model. Enter a prompt and click the Generate button to generate completions for the prompt.

https://huggingface.co/spaces/rasyosef/GPT2-Amharic
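Alternatively, the model can be run locally with the Hugging Face transformers library. This is a minimal sketch; the generation settings and example prompt are illustrative, not taken from this card:

```python
from transformers import pipeline

model_id = "rasyosef/gpt2-small-amharic-128-v3"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Downloads the model weights on first use. Note the 128-token
    # context size bounds prompt length plus generated tokens.
    gpt2_amharic = pipeline("text-generation", model=model_id)
    out = gpt2_amharic(prompt, max_new_tokens=max_new_tokens,
                       repetition_penalty=1.2)
    return out[0]["generated_text"]

# Example usage (Amharic prompt): generate("አዲስ አበባ")
```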

Weights are available in Safetensors format (33.7M params, F32 tensors).