# Ar-Mulitlingual-MiniLM

This model is a fine-tuned version of microsoft/Multilingual-MiniLM-L12-H384 on an unknown dataset.
## Model description

More information needed.
## Intended uses & limitations

More information needed.
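The card does not state the downstream task, so the snippet below is only a minimal loading sketch. It assumes the checkpoint is hosted on the Hugging Face Hub; `your-username/Ar-Mulitlingual-MiniLM` is a placeholder repo id, and `AutoModel` is used because the head type is not documented.

```python
# Minimal loading sketch.
# Assumptions: placeholder repo id, generic encoder output (no task head documented).
from transformers import AutoModel, AutoTokenizer

model_id = "your-username/Ar-Mulitlingual-MiniLM"  # placeholder, replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Example Arabic input ("An example of a short Arabic sentence.")
inputs = tokenizer("مثال على جملة عربية قصيرة.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, 384) for the MiniLM-L12-H384 backbone
```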
## Training and evaluation data

More information needed.
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 24
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 2
- mixed_precision_training: Native AMP

A rough `TrainingArguments` sketch of these settings appears after the framework versions list below.

### Training results

More information needed.

### Framework versions

- Transformers 4.18.0
- Pytorch 1.11.0+cu113
- Tokenizers 0.12.1
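For reference, the hyperparameters listed under "Training procedure" map roughly onto the following `TrainingArguments` configuration. This is a sketch, not the author's training script: the output directory is a placeholder, the `Trainer`/dataset wiring is omitted, a single training device is assumed, and the Adam betas/epsilon shown are the library defaults that match the card.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed above
# (not the author's actual script; output_dir is a placeholder).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./ar-multilingual-minilm",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=24,  # assuming a single device, matching train_batch_size: 24
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=2,
    lr_scheduler_type="linear",
    adam_beta1=0.9,     # library default, matches the card
    adam_beta2=0.999,   # library default, matches the card
    adam_epsilon=1e-8,  # library default, matches the card
    fp16=True,          # "Native AMP" mixed-precision training; requires a CUDA device
)
```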