---
tags:
- adapterhub:nli/rte
- adapter-transformers
- distilbert
- text-classification
license: apache-2.0
---
# Adapter `distilbert-base-uncased_nli_rte_houlsby` for distilbert-base-uncased
An adapter for distilbert-base-uncased in the Houlsby architecture, trained on the RTE dataset for 15 epochs with early stopping and a learning rate of 1e-4.

**This adapter was created for usage with the [Adapters](https://github.com/adapter-hub/adapters) library.**
## Usage
First, install the `adapters` library:

```
pip install -U adapters
```
Now, the adapter can be loaded and activated like this:

```python
from adapters import AutoAdapterModel

# Load the base model with adapter support
model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")

# Load the adapter (including its classification head) and activate it
adapter_name = model.load_adapter("AdapterHub/distilbert-base-uncased_nli_rte_houlsby")
model.set_active_adapters(adapter_name)
```
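After activation, the model behaves like a standard sequence-classification model. The following is a minimal inference sketch; the premise/hypothesis pair is illustrative, and RTE is a binary task (entailment vs. not entailment):

```python
import torch
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# RTE is a sentence-pair task: does the premise entail the hypothesis?
inputs = tokenizer(
    "A man is playing a guitar.",       # premise (illustrative)
    "A man is playing an instrument.",  # hypothesis (illustrative)
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits
prediction = logits.argmax(dim=-1).item()  # index into the head's label map
```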
## Architecture & Training

- Adapter architecture: houlsby
- Prediction head: classification
- Dataset: RTE
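For context, below is a hypothetical sketch of how an equivalent adapter could be trained with the Adapters library, combining the settings stated above (Houlsby adapters, 15 epochs, learning rate 1e-4, early stopping). The GLUE RTE split, tokenization choices, patience value, and output path are assumptions, not the exact original setup:

```python
from adapters import AdapterTrainer, AutoAdapterModel, DoubleSeqBnConfig
from datasets import load_dataset
from transformers import AutoTokenizer, EarlyStoppingCallback, TrainingArguments

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")

# GLUE RTE split assumed; its columns are sentence1, sentence2, label
dataset = load_dataset("glue", "rte").map(
    lambda ex: tokenizer(ex["sentence1"], ex["sentence2"],
                         truncation=True, padding="max_length", max_length=128),
    batched=True,
)

model = AutoAdapterModel.from_pretrained("distilbert-base-uncased")
# DoubleSeqBnConfig is the Houlsby architecture: bottleneck adapters after
# both the attention and the feed-forward block of each transformer layer
model.add_adapter("rte", config=DoubleSeqBnConfig())
model.add_classification_head("rte", num_labels=2)
model.train_adapter("rte")  # freeze the base model; train only adapter + head

args = TrainingArguments(
    output_dir="rte_houlsby",  # assumed path
    learning_rate=1e-4,
    num_train_epochs=15,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="loss",
)
trainer = AdapterTrainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    eval_dataset=dataset["validation"],
    callbacks=[EarlyStoppingCallback(early_stopping_patience=3)],  # patience assumed
)
trainer.train()
model.save_adapter("rte_houlsby/final", "rte")
```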
## Author Information

- Author name(s): Clifton Poth
- Author email: [email protected]
- Author links: Website, GitHub, Twitter
## Citation
This adapter has been auto-imported from https://github.com/Adapter-Hub/Hub/blob/master/adapters/ukp/distilbert-base-uncased_nli_rte_houlsby.yaml.