
Client Error: Can't load the model (missing config file)

#26
by benhachem - opened

Hello all!

I got this error while trying to load the model:

HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/mistralai/Pixtral-12B-2409/resolve/main/config.json 

And while checking the repo's Files, I don't see any config.json file. Do you know any way around this error?

I used the exact same code from the Model Card to load the model:

from vllm import LLM
from vllm.sampling_params import SamplingParams

model_name = "mistralai/Pixtral-12B-2409"

sampling_params = SamplingParams(max_tokens=8192)

llm = LLM(model=model_name, tokenizer_mode="mistral")

Thanks!
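For reference, a hedged sketch of a likely workaround: the repo intentionally ships Mistral's native format (params.json and consolidated.safetensors) instead of a Transformers-style config.json, so vLLM has to be told to read that format; a 403 can also simply mean the Hub request was made without valid credentials, so logging in first (for example with huggingface-cli login) is worth checking. The config_format and load_format arguments below assume a recent vLLM release that supports Mistral-format loading:

from vllm import LLM
from vllm.sampling_params import SamplingParams

model_name = "mistralai/Pixtral-12B-2409"
sampling_params = SamplingParams(max_tokens=8192)

# tokenizer_mode="mistral" uses the tekken.json tokenizer from the repo;
# config_format/load_format tell vLLM to read params.json and
# consolidated.safetensors instead of looking for config.json.
llm = LLM(
    model=model_name,
    tokenizer_mode="mistral",
    config_format="mistral",
    load_format="mistral",
)
# sampling_params can then be passed to llm.chat(...), as in the Model Card example.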

I got the same error message with mlx_vlm. Now trying mlx-community/pixtral-12b-bf16, where a config.json is present - it will be a while, though.

Edit: it runs that way.
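For anyone taking the MLX route instead, a minimal sketch of the step that worked above, assuming mlx_vlm's load() API and the model id mentioned in the comment:

from mlx_vlm import load

# The mlx-community conversion includes a config.json, so loading succeeds
# where the original repo raised the error above.
model, processor = load("mlx-community/pixtral-12b-bf16")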
