---
tags:
- adapter-transformers
- xlm-roberta
datasets:
- UKPLab/m2qa
---
# Adapter `AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-product-reviews` for xlm-roberta-base
An [adapter](https://adapterhub.ml) for the `xlm-roberta-base` model that was trained on the [UKPLab/m2qa](https://huggingface.co/datasets/UKPLab/m2qa/) dataset.
This adapter was created for use with the **[adapter-transformers](https://github.com/Adapter-Hub/adapter-transformers)** library.
## Usage
First, install `adapter-transformers`:
```bash
pip install -U adapter-transformers
```
_Note: adapter-transformers is a fork of transformers that acts as a drop-in replacement with adapter support. [More](https://docs.adapterhub.ml/installation.html)_
Now, the adapter can be loaded and activated like this:
```python
from transformers import AutoAdapterModel

# Load the base model with adapter support.
model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

# Download the domain adapter from the Hub and activate it.
adapter_name = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-product-reviews", source="hf", set_active=True)
```
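With the adapter active, the model can be used like any other `xlm-roberta-base` model. The sketch below shows what extractive QA inference might look like, **assuming** a question-answering prediction head is loaded and active; on its own, this domain adapter is typically stacked with language and task adapters in the full MAD-X setup (see the composition sketch in the next section). The question/context pair is a made-up example.

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")

# Hypothetical example from the product-reviews domain this adapter targets.
question = "What does the reviewer dislike about the battery?"
context = "The screen is great, but the battery drains within a few hours."

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Assumes a QA head is active, so the output exposes start/end logits.
start = torch.argmax(outputs.start_logits)
end = torch.argmax(outputs.end_logits) + 1
print(tokenizer.decode(inputs["input_ids"][0][start:end]))
```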
## Architecture & Training
For more information on the architecture and training setup, see our repository: https://github.com/UKPLab/m2qa/tree/main/Experiments/mad-x-domain
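The MAD-X recipe composes independently trained adapters by stacking them at inference time. Below is a minimal sketch of how this domain adapter might be combined with a language adapter and a task adapter using adapter-transformers' `Stack` composition block. The language- and task-adapter identifiers are placeholders; consult the repository linked above for the exact configuration used in M2QA.

```python
from transformers import AutoAdapterModel
from transformers.adapters.composition import Stack

model = AutoAdapterModel.from_pretrained("xlm-roberta-base")

# Domain adapter from this card, plus placeholder language/task adapters.
domain = model.load_adapter("AdapterHub/m2qa-xlm-roberta-base-mad-x-domain-product-reviews", source="hf")
language = model.load_adapter("de/wiki@ukp")  # example MAD-X language adapter from AdapterHub
task = model.load_adapter("hypothetical/qa-task-adapter", source="hf")  # placeholder name

# Stack the adapters: language first, then domain, then the task adapter.
model.active_adapters = Stack(language, domain, task)
```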
## Evaluation results
For evaluation results, see the [M2QA paper](https://arxiv.org/abs/2407.01091) and the repository linked above.
## Citation
```bibtex
@article{englaender-etal-2024-m2qa,
  title   = {M2QA: Multi-domain Multilingual Question Answering},
  author  = {Engl{\"a}nder, Leon and Sterz, Hannah and Poth, Clifton and
             Pfeiffer, Jonas and Kuznetsov, Ilia and Gurevych, Iryna},
  journal = {arXiv preprint arXiv:2407.01091},
  url     = {https://arxiv.org/abs/2407.01091},
  month   = jul,
  year    = {2024}
}
``` |