MoritzLaurer committed
Commit 4bd04d9
1 parent: a15d0e2

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -20,7 +20,7 @@ and in this [paper](https://arxiv.org/pdf/2312.17543.pdf).
 The foundation model is [microsoft/deberta-v3-xsmall](https://huggingface.co/microsoft/deberta-v3-xsmall).
 The model only has 22 million backbone parameters and 128 million vocabulary parameters.
 The backbone parameters are the main parameters active during inference, providing a significant speedup over larger models.
-The model is 241 MB small.
+The model is 142 MB small.
 
 This model was trained to provide a small and highly efficient zeroshot option,
 especially for edge devices or in-browser use-cases with transformers.js.
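
Below is a minimal sketch of the in-browser zero-shot use-case that the edited README text refers to, using transformers.js. The model id shown is an assumption (the diff does not name the repository this README belongs to), and running it in the browser presumes ONNX weights are available for the model.

```typescript
// Minimal sketch: zero-shot classification in the browser with transformers.js.
// The model id below is an assumption; substitute the repository this README belongs to.
import { pipeline } from "@huggingface/transformers";

// Load the zero-shot classification pipeline (downloads and caches the model weights).
const classifier = await pipeline(
  "zero-shot-classification",
  "MoritzLaurer/deberta-v3-xsmall-zeroshot-v1.1-all-33", // assumed repo id
);

// Classify a text against candidate labels chosen at inference time.
const result = await classifier(
  "Angela Merkel is a politician in Germany and leader of the CDU",
  ["politics", "economy", "entertainment", "environment"],
);

console.log(result); // { sequence: ..., labels: [...], scores: [...] }
```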