mwitiderrick committed
Commit 89f3303
1 Parent(s): 15be350

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -13,4 +13,4 @@ This DistilBERT was sparsified using the [SparseML](https://github.com/neuralmag
 
 # Sparse Transfer 80% VNNI Pruned DistilBERT
 
-This model is the result of pruning the DistilBERT model to 80% using the VNNI blocking (semi-structured), followed by fine-tuning and quantization on the SST2 dataset. Pruning is performed with the GMP algorithm and using the masked language modeling task based on the BookCorpus and Wikipedia datasets. It achieves 90.5% accuracy on the validation dataset, recovering over 99% of the accuracy of the baseline model. See the included [recipe](https://sparsezoo.neuralmagic.com/models/nlp%2Fsentiment_analysis%2Fdistilbert-none%2Fpytorch%2Fhuggingface%2Fsst2%2Fpruned80_quant-none-vnni) for training instructions.
+This model is the result of pruning the DistilBERT model to 80% using the VNNI blocking (semi-structured), followed by fine-tuning and quantization on the SST2 dataset. Pruning is performed with the GMP algorithm and using the masked language modeling task based on the BookCorpus and Wikipedia datasets. It achieves 90.5% accuracy on the validation dataset, recovering over 99% of the accuracy of the baseline model. See the included [recipe](https://sparsezoo.neuralmagic.com/models/distilbert-sst2_wikipedia_bookcorpus-pruned80.4block_quantized?comparison=distilbert-sst2_wikipedia_bookcorpus-base) for training instructions.
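For context on the GMP algorithm named in the changed line: gradual magnitude pruning typically ramps the sparsity target from an initial to a final value over training with a cubic schedule (Zhu & Gupta, 2017). The sketch below is illustrative only; the function and parameter names are placeholders, not SparseML's actual recipe API.

```python
def gmp_sparsity(step: int, start_step: int, end_step: int,
                 init_sparsity: float = 0.0, final_sparsity: float = 0.8) -> float:
    """Cubic sparsity ramp used by gradual magnitude pruning (illustrative sketch).

    Before start_step the schedule holds init_sparsity; after end_step it holds
    final_sparsity (0.8 here, matching the 80% target in this README).
    """
    if step <= start_step:
        return init_sparsity
    if step >= end_step:
        return final_sparsity
    # Fraction of the pruning window completed, in (0, 1).
    progress = (step - start_step) / (end_step - start_step)
    # Cubic interpolation: sparsity rises quickly early, then levels off.
    return final_sparsity + (init_sparsity - final_sparsity) * (1.0 - progress) ** 3
```

At each scheduled step, weights with the smallest magnitudes are masked until the layer reaches `gmp_sparsity(step, ...)`; in the actual recipe linked above, this schedule is applied per layer with VNNI-friendly 4-weight blocks rather than individual weights.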