Load model from Hugging Face
#16
by HemanthSai7 - opened
I want to use this version of the model.
https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.1-GGUF/blob/main/mistral-7b-instruct-v0.1.Q4_K_M.gguf
What path should I specify in LangChain, since I can't download and run the model locally due to low resources?
I don't think that's doable.
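For context, a minimal sketch of the usual pattern is below (assuming llama-cpp-python, langchain-community, and huggingface_hub are installed; the import path varies by LangChain version). It still requires fetching the roughly 4 GB GGUF file to local disk first, which is why pointing LangChain at a remote path alone won't work.

```python
# Minimal sketch, not the only way to do this. Assumes llama-cpp-python,
# langchain-community, and huggingface_hub are installed. The GGUF file is
# downloaded into the local Hugging Face cache; LlamaCpp needs a local path.
from huggingface_hub import hf_hub_download
from langchain_community.llms import LlamaCpp

# Fetch the quantized model file (roughly 4 GB) into the local HF cache.
model_path = hf_hub_download(
    repo_id="TheBloke/Mistral-7B-Instruct-v0.1-GGUF",
    filename="mistral-7b-instruct-v0.1.Q4_K_M.gguf",
)

# Load the local file with llama.cpp via LangChain and run a quick prompt.
llm = LlamaCpp(model_path=model_path, n_ctx=2048, temperature=0.7)
print(llm.invoke("What is the capital of France?"))
```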
Ohh got it.
HemanthSai7 changed discussion status to closed