Token length 3908?
#1
by Yhyu13 - opened
While doing GPTQ quantization for this model, I get this error:
Token indices sequence length is longer than the specified maximum sequence length for this model (3908 > 2048). Running this sequence through the model will result in indexing errors
Why has the token length changed?
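For reference, a minimal sketch (assuming the transformers library; the long text is hypothetical, not a real calibration sample) that reproduces this tokenizer warning when an encoded example exceeds the 2048-token limit:

```python
# Hypothetical reproduction: encoding a calibration sample longer than the
# model's 2048-token limit without truncation triggers the
# "Token indices sequence length is longer..." tokenizer warning.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("openaccess-ai-collective/jeopardy-bot")

long_text = "hypothetical calibration text " * 2000  # assumed, not from the thread
input_ids = tokenizer(long_text)["input_ids"]  # no truncation -> warning is emitted
print(len(input_ids))  # well over 2048
```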
The maximum token length is definitely the default 2048. See https://huggingface.co/openaccess-ai-collective/jeopardy-bot/blob/main/config.json
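A minimal sketch (again assuming the transformers library and the model id from the linked config) that confirms the configured context length and truncates a calibration example so it stays within it:

```python
# Confirm the configured context length from the linked config.json and
# truncate calibration text so it fits; the warning then no longer applies.
from transformers import AutoConfig, AutoTokenizer

model_id = "openaccess-ai-collective/jeopardy-bot"
config = AutoConfig.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)

max_len = config.max_position_embeddings  # 2048 per the linked config.json
print("max_position_embeddings:", max_len)

sample = "hypothetical calibration text " * 2000  # assumed example text
enc = tokenizer(sample, truncation=True, max_length=max_len)
print(len(enc["input_ids"]))  # <= 2048
```

With truncation applied, each calibration example is capped at 2048 tokens, so the warning only indicates that the raw text was longer than the context window, not that the model's configured length changed.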
winglian changed discussion status to closed