
Swahili News Classification with RoBERTa

This model was trained using HuggingFace's Flax framework and is part of the JAX/Flax Community Week organized by HuggingFace. All training was done on a TPUv3-8 VM sponsored by the Google Cloud team.

A pretrained Swahili RoBERTa model was used as the base and fine-tuned for this classification task.

How to use

from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("flax-community/roberta-swahili-news-classification")
model = AutoModelForSequenceClassification.from_pretrained("flax-community/roberta-swahili-news-classification")

Eval metrics: {'accuracy': 0.9153416415986249}
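The sequence-classification head returns raw logits, one per news category. A minimal sketch of turning those logits into a predicted class is shown below; the logit values and the five-class assumption are illustrative only, not taken from this model card.

```python
import numpy as np

# Hypothetical raw logits for one news article, as a sequence-classification
# head would return them (values made up for illustration; the actual label
# set depends on the training data).
logits = np.array([2.1, -0.3, 0.4, -1.2, 0.0])

# Softmax turns logits into a probability distribution over the classes.
probs = np.exp(logits - logits.max())
probs /= probs.sum()

# The predicted category is the index with the highest probability.
predicted_class = int(np.argmax(probs))
print(predicted_class)
```

In practice the logits come from `model(**tokenizer(text, return_tensors="pt")).logits`, and the index maps to a label via the model's `id2label` config.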
Model size: 105M parameters
