vLLM compatible?
#10
by nickandbro - opened
Would be nice to see this model supported in vLLM.
Hello @nickandbro,
Please make sure that you've enabled vLLM in your local apps at https://huggingface.co/settings/local-apps.
Afterwards, you should be able to see the vLLM snippet under the "Use this Model" dropdown on https://huggingface.co/nvidia/Llama-3_1-Nemotron-51B-Instruct.
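For reference, the generated snippet is usually along these lines (a minimal sketch; options such as `tensor_parallel_size` and `trust_remote_code` are assumptions and depend on your hardware and vLLM version):

```python
from vllm import LLM, SamplingParams

# Load the model from the Hub; adjust the options below to your setup.
llm = LLM(
    model="nvidia/Llama-3_1-Nemotron-51B-Instruct",
    trust_remote_code=True,   # assumption: the custom Nemotron architecture may need this
    tensor_parallel_size=2,   # assumption: set to the number of GPUs you want to shard across
)

sampling_params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=256)

# Run a quick generation to confirm the model loads and responds.
outputs = llm.generate(["What is vLLM?"], sampling_params)
print(outputs[0].outputs[0].text)
```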
Please let me know if it works.
Thanks!