vLLM compatible?

#10
by nickandbro - opened

It would be nice to see this supported in vLLM.

Hello @nickandbro,

Please make sure that you've enabled vLLM in your local apps at https://huggingface.co/settings/local-apps.

Afterwards, you should be able to see the vLLM snippet under the "Use this model" dropdown on https://huggingface.co/nvidia/Llama-3_1-Nemotron-51B-Instruct.
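For reference, the generated snippet should look roughly like the offline-inference sketch below. The exact arguments are assumptions on my side (e.g. `trust_remote_code` for the custom architecture and `tensor_parallel_size` for your GPU count) and may differ from what the dropdown shows for this model:

```python
from vllm import LLM, SamplingParams

# Load the model with vLLM (assumed arguments; adjust to your setup).
llm = LLM(
    model="nvidia/Llama-3_1-Nemotron-51B-Instruct",
    trust_remote_code=True,    # assumption: may be required for the custom model code
    tensor_parallel_size=2,    # assumption: set to the number of GPUs available
)

# Simple generation to verify the model loads and responds.
sampling_params = SamplingParams(temperature=0.7, max_tokens=256)
outputs = llm.generate(["Write a short haiku about GPUs."], sampling_params)
print(outputs[0].outputs[0].text)
```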

Please let me know if it works.

[Screenshot attached: Screenshot from 2024-10-25 14-24-18.png]

Hi, I am getting the error shown in the screenshot above. Could you please help me fix it?
