Context length is still 4096
#7
by Shahrukh181 - opened
I am using TheBloke's quantization of this model and found that it has a context length of 4096 without any changes to the code. I read that this model has a 16k context length out of the box and can go up to 100k. Why is that?
Is that the quantized WizardCoder-34B model?
Yeah.
TheBloke's GGUF version shows 16k context.
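For anyone landing here later: the GGUF file only records the model's training context in its metadata; the window you actually get is whatever the loader allocates at startup, and many frontends default to 4096. A minimal sketch with llama-cpp-python, assuming a locally downloaded quantization (the file name and the `n_ctx` value are illustrative):

```python
from llama_cpp import Llama

# Request the model's native 16k window instead of the loader's
# default. The path is illustrative; point it at whichever
# quantization file you downloaded from TheBloke's repo.
llm = Llama(
    model_path="wizardcoder-python-34b-v1.0.Q4_K_M.gguf",
    n_ctx=16384,
    # To push past 16k (the "up to 100k" claim relies on RoPE
    # extrapolation), you can raise n_ctx further and, if needed,
    # override rope_freq_base; leaving it unset uses the value
    # stored in the GGUF metadata.
)

out = llm("Write a Python function that reverses a string.", max_tokens=128)
print(out["choices"][0]["text"])
```

Note that a larger context costs more RAM/VRAM for the KV cache, so the 4096 you saw was almost certainly a loader default rather than a property of the quantized weights.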
WizardLM changed discussion status to closed