latest vllm docker (v0.6.2) fails to load
#1 opened by choronz333
--model=neuralmagic/Phi-3.5-mini-instruct-FP8-KV
OSError: neuralmagic/Phi-3.5-mini-instruct-FP8-KV does not appear to have a file named configuration_phi3.py. Checkout 'https://huggingface.co/neuralmagic/Phi-3.5-mini-instruct-FP8-KV/tree/main' for available files.
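For context, this error comes from the remote-code loading path: the repo's config referenced a custom configuration_phi3.py that was not yet uploaded alongside config.json. A minimal sketch that would hit the same lookup, assuming trust_remote_code is enabled (the exact launch command is not shown in this thread):

```python
# Sketch of the remote-code config lookup (assumes trust_remote_code=True;
# the original launch command is not shown in this thread).
from transformers import AutoConfig

# Before the fix, this raised OSError because configuration_phi3.py was
# referenced by the repo's config but missing from the repository files.
config = AutoConfig.from_pretrained(
    "neuralmagic/Phi-3.5-mini-instruct-FP8-KV",
    trust_remote_code=True,
)
print(type(config))
```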
choronz333 changed discussion title from "vllm docker load fail to load" to "latest vllm docker (v0.6.2) fails to load"
Yes, I can confirm. The problem seems to come from the OpenAI server; the offline code works.
Update: after trying and failing to start the OpenAI instance, the offline code stopped working as well.
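For reference, the "offline" path being compared against the OpenAI server is roughly the following. A minimal sketch using vLLM's offline LLM API; the prompt and sampling settings are illustrative, not taken from the thread:

```python
# Rough sketch of the offline path referred to above (prompt and sampling
# values are illustrative, not from the thread).
from vllm import LLM, SamplingParams

llm = LLM(
    model="neuralmagic/Phi-3.5-mini-instruct-FP8-KV",
    trust_remote_code=True,
)
sampling_params = SamplingParams(temperature=0.7, max_tokens=64)
outputs = llm.generate(
    ["Explain KV-cache quantization in one sentence."],
    sampling_params,
)
print(outputs[0].outputs[0].text)
```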
Thanks for reporting this; I have added the config and modeling files.
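One way to verify the added files are now visible on the Hub, a small sketch using huggingface_hub (the configuration file name comes from the error message above; the modeling file name is assumed to follow the standard Phi-3 naming):

```python
# Quick check that the referenced files now exist in the repo.
# configuration_phi3.py is named in the error above; modeling_phi3.py is an
# assumed name based on standard Phi-3 repos.
from huggingface_hub import list_repo_files

files = list_repo_files("neuralmagic/Phi-3.5-mini-instruct-FP8-KV")
print("configuration_phi3.py" in files, "modeling_phi3.py" in files)
```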
mgoin changed discussion status to closed