After starting vLLM, a chat template error is reported
#1 opened by Gin10086
Can it be solved?
Yes, I will work on it today. I suspect it's an issue with the tokenizer config from the base model not adhering to the original Llama template.
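For anyone hitting the same error in the meantime, a minimal sketch for checking whether the repo's tokenizer config actually ships a chat template (the model id below is a placeholder, not this repo's actual name):

```python
from transformers import AutoTokenizer

# Placeholder id; substitute the actual repo this discussion belongs to.
tok = AutoTokenizer.from_pretrained("your-org/your-model")

# vLLM's chat endpoint needs a template; if this prints False, the
# tokenizer config is the likely culprit for the startup complaint.
print(tok.chat_template is not None)
```

As a stopgap, vLLM's OpenAI-compatible server also accepts a `--chat-template` flag to supply a template file explicitly instead of relying on the one bundled with the tokenizer.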
I have fixed the chat template setting. Let me know if any other issues arise.
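A quick way to sanity-check the fix after re-downloading the config, again assuming the placeholder model id above: render a sample conversation through the template and confirm it produces a Llama-style prompt.

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("your-org/your-model")

messages = [{"role": "user", "content": "Hello!"}]

# Renders the conversation with the repo's chat template; well-formed
# Llama-format special tokens indicate the updated template is in effect.
prompt = tok.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt)
```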
ZeroXClem changed discussion status to closed