|
--- |
|
tags: |
|
- adapter-transformers |
|
- adapterhub:hi/wiki |
|
- xlm-roberta |
|
language: |
|
- hi |
|
license: "apache-2.0" |
|
--- |
|
|
|
# Adapter `xlm-roberta-base-hi-wiki_pfeiffer` for xlm-roberta-base |
|
|
|
A Pfeiffer adapter trained with masked language modelling (MLM) on Hindi Wikipedia articles for 250k steps with a batch size of 64.
|
|
|
|
|
**This adapter was created for usage with the [Adapters](https://github.com/Adapter-Hub/adapters) library.** |
|
|
|
## Usage |
|
|
|
First, install `adapters`: |
|
|
|
```bash
pip install -U adapters
```
|
|
|
Now, the adapter can be loaded and activated like this: |
|
|
|
```python
from adapters import AutoAdapterModel

model = AutoAdapterModel.from_pretrained("xlm-roberta-base")
adapter_name = model.load_adapter("AdapterHub/xlm-roberta-base-hi-wiki_pfeiffer")
model.set_active_adapters(adapter_name)
```
|
|
|
## Architecture & Training |
|
|
|
- Adapter architecture: pfeiffer |
|
- Prediction head: None |
|
- Dataset: [hi/wiki](https://adapterhub.ml/explore/hi/wiki/) |
|
|
|
## Author Information |
|
|
|
- Author name(s): Jonas Pfeiffer |
|
- Author email: [email protected] |
|
- Author links: [Website](https://pfeiffer.ai), [GitHub](https://github.com/jopfeiff), [Twitter](https://twitter.com/PfeiffJo)
|
|
|
|
|
|
|
## Citation |
|
|
|
```bibtex
@article{pfeiffer20madx,
  title={{MAD-X}: An {A}dapter-based {F}ramework for {M}ulti-task {C}ross-lingual {T}ransfer},
  author={Pfeiffer, Jonas and Vuli\'{c}, Ivan and Gurevych, Iryna and Ruder, Sebastian},
  journal={arXiv preprint},
  year={2020},
  url={https://arxiv.org/pdf/2005.00052.pdf}
}
```
|
|
|
*This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/xlm-roberta-base-hi-wiki_pfeiffer.yaml.*