# Adapter `AdapterHub/roberta-base-pf-social_i_qa` for roberta-base
An adapter for the `roberta-base` model that was trained on the social_i_qa dataset and includes a prediction head for multiple choice.
This adapter was created for usage with the `adapter-transformers` library.

## Usage

First, install `adapter-transformers`:

```
pip install -U adapter-transformers
```

*Note: `adapter-transformers` is a fork of `transformers` that acts as a drop-in replacement with adapter support.*
Now, the adapter can be loaded and activated like this:

```python
from transformers import AutoModelWithHeads

model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-social_i_qa", source="hf")
model.active_adapters = adapter_name
```
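For completeness, below is a minimal end-to-end inference sketch. This is assumed usage rather than part of the original card: the context, question, and candidate answers are illustrative, and the comment on the logits shape assumes the multiple-choice head folds the per-candidate scores back into one row per question, as adapter-transformers prediction heads typically do.

```python
# Inference sketch (assumed usage). Social IQA is a three-way multiple-choice
# task, so we score each (context + question, answer) pair and pick the
# candidate with the highest logit.
import torch
from transformers import AutoModelWithHeads, RobertaTokenizer

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = AutoModelWithHeads.from_pretrained("roberta-base")
adapter_name = model.load_adapter("AdapterHub/roberta-base-pf-social_i_qa", source="hf")
model.active_adapters = adapter_name

# Illustrative example, not taken from the dataset card.
context = "Tracy didn't go home that evening and resisted Riley's attacks."
question = "What does Tracy need to do before this?"
answers = ["make a new plan", "go home and see Riley", "find somewhere to go"]

# Encode the three (context + question, answer) pairs as one batch;
# the multiple-choice head regroups them into one row of three scores.
encoded = tokenizer(
    [f"{context} {question}"] * len(answers),
    answers,
    padding=True,
    return_tensors="pt",
)

model.eval()
with torch.no_grad():
    logits = model(**encoded).logits  # assumed shape: (1, num_choices)

print(answers[logits.argmax(dim=-1).item()])
```

Encoding each candidate answer as a separate text pair, with the shared context repeated per choice, mirrors the standard input format for multiple-choice heads on RoBERTa-style models.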
## Architecture & Training

The training code for this adapter is available at https://github.com/adapter-hub/efficient-task-transfer. In particular, training configurations for all tasks can be found in that repository.
## Evaluation results

Refer to the paper for more information on results.
## Citation

If you use this adapter, please cite our paper "What to Pre-Train on? Efficient Intermediate Task Selection":
```bibtex
@inproceedings{poth-etal-2021-what-to-pre-train-on,
    title = "What to Pre-Train on? Efficient Intermediate Task Selection",
    author = "Clifton Poth and Jonas Pfeiffer and Andreas Rücklé and Iryna Gurevych",
    booktitle = "Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
    month = nov,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2104.08247",
    pages = "to appear",
}
```