
Failed to create LLM 'stablelm'

#2 opened by jonathanjordan21

I got this error when using ctransformers, although the .gguf file was successfully downloaded:

RuntimeError: Failed to create LLM 'stablelm' from '/root/.cache/huggingface/hub/models--TheBloke--Marx-3B-v3-GGUF/blobs/fb16032da1b4f68d465cb7b2164c5305be8a008657ed0cd6cbb91b3a94b032ee'.

This is the code example:

from ctransformers import AutoModelForCausalLM

llm = AutoModelForCausalLM.from_pretrained("TheBloke/Marx-3B-v3-GGUF", model_file="marx-3b-v3.Q3_K_S.gguf", model_type='stablelm')
print(llm("AI is going to"))


Not TheBloke, but the original creator here. I see you're using ctransformers, and I don't think it supports StableLM models. However, the official GGUF project, llama.cpp, does.
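For anyone else hitting this, here is a minimal sketch of loading the same GGUF file through llama.cpp's Python bindings (llama-cpp-python) together with huggingface_hub. The repo and file names are taken from the post above; max_tokens and everything else are assumptions, not settings from this thread:

# A minimal sketch, assuming llama-cpp-python and huggingface_hub are installed
# (pip install llama-cpp-python huggingface_hub).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the same quantized file referenced above into the local Hub cache.
model_path = hf_hub_download(
    repo_id="TheBloke/Marx-3B-v3-GGUF",
    filename="marx-3b-v3.Q3_K_S.gguf",
)

# Load the GGUF with llama.cpp instead of ctransformers.
llm = Llama(model_path=model_path)

# Same prompt as the original example; max_tokens=64 is an assumed value.
out = llm("AI is going to", max_tokens=64)
print(out["choices"][0]["text"])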
