Spanish GPT-2

GPT-2 model trained from scratch on the Spanish portion of OSCAR. The model was trained with Flax on TPUs sponsored by Google, as part of the Flax/JAX Community Week organised by Hugging Face.

Model description

The model used for training is OpenAI's GPT-2, introduced in the paper "Language Models are Unsupervised Multitask Learners" by Alec Radford, Jeffrey Wu, Rewon Child, David Luan, Dario Amodei and Ilya Sutskever.

This model is available in the 🤗 Model Hub.
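The checkpoint can be loaded with the standard 🤗 Transformers auto classes. A minimal generation sketch (the prompt and sampling parameters here are illustrative, not settings from the training run):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Model ID on the 🤗 Model Hub, as listed on this card.
model_id = "flax-community/gpt-2-spanish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative Spanish prompt; sampling settings are arbitrary examples.
prompt = "Érase una vez"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because GPT-2 uses byte-level BPE, the tokenizer handles accented Spanish characters without any special preprocessing.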

Training data

The Spanish portion of OSCAR (Open Super-large Crawled ALMAnaCH coRpus), a huge multilingual corpus obtained by language classification and filtering of the Common Crawl corpus using the goclassy architecture.

This corpus is available in the 🤗 Datasets library.

Team members

Model size: 137M parameters (Safetensors; tensor types F32, U8)

Model tree for flax-community/gpt-2-spanish

Finetunes: 2 models
Quantizations: 1 model

Dataset used to train flax-community/gpt-2-spanish

Spaces using flax-community/gpt-2-spanish: 2