vocabulary size mismatch - wrong tokenizer.model
#21
by mradermacher
The model's embedding layer has 32000 entries, but tokenizer.model defines 32001 tokens, which means it cannot be converted to GGUF (at least not without fixing the mismatch first).
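For anyone who wants to verify this locally, here is a minimal sketch that compares the two counts. It assumes the checkpoint lives in `./model-dir` (a hypothetical path) and ships a SentencePiece `tokenizer.model` alongside a standard `config.json` with a `vocab_size` field:

```python
import json
from sentencepiece import SentencePieceProcessor

model_dir = "./model-dir"  # hypothetical path; point this at the actual checkpoint

# Load the SentencePiece tokenizer and read the declared model config.
sp = SentencePieceProcessor(model_file=f"{model_dir}/tokenizer.model")
with open(f"{model_dir}/config.json") as f:
    config = json.load(f)

# The two numbers must match for GGUF conversion to succeed.
print("tokenizer.model vocab:", sp.vocab_size())        # e.g. 32001
print("config.json vocab_size:", config["vocab_size"])  # e.g. 32000
```

If the numbers differ, either the model's embedding matrix has to be resized to match the tokenizer, or the extra token has to be dropped from the tokenizer, before conversion can work.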