---
language: en
license: apache-2.0
tags:
- fill-mask
datasets:
- wikipedia
- bookcorpus
---

# 80% 1x4 Block Sparse BERT-Large (uncased) Prune OFA

This model was created using the Prune OFA method described in [Prune Once for All: Sparse Pre-Trained Language Models](https://arxiv.org/abs/2111.05754), presented at the ENLSP NeurIPS Workshop 2021.
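As a minimal usage sketch, the model can be loaded for fill-mask inference with the `transformers` library. The Hub model id below is an assumption; substitute this repository's actual id:

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

# Assumed Hub id; replace with this repository's actual model id.
model_id = "Intel/bert-large-uncased-sparse-80-1x4-block-pruneofa"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask inference: the model predicts the token behind [MASK].
unmasker = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(unmasker("Paris is the [MASK] of France."))
```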

For further details on the model and its results, see our paper and our implementation available here.
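Since pruned weights are kept as explicit zeros in the dense checkpoint, the advertised sparsity can be verified directly. A sketch, assuming the same hypothetical model id as above and that pruning targets the encoder's Linear weight matrices:

```python
import torch
from transformers import AutoModelForMaskedLM

model_id = "Intel/bert-large-uncased-sparse-80-1x4-block-pruneofa"  # assumed id
model = AutoModelForMaskedLM.from_pretrained(model_id)

zeros = total = 0
for name, module in model.named_modules():
    # Count zeros only in the Transformer encoder's Linear weights
    # (attention and feed-forward projections), not the embeddings.
    if isinstance(module, torch.nn.Linear) and "encoder" in name:
        zeros += (module.weight == 0).sum().item()
        total += module.weight.numel()

print(f"Encoder Linear weight sparsity: {zeros / total:.1%}")  # expected ~80%
```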