How do I create this model in Ollama?
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51186]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51187]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51188]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51189]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51190]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51191]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51192]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51193]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51194]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51195]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51196]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51197]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51198]
Error: no FROM line for the model was specified
This is the error I am getting when I run: ollama create phi-2-q4 -f ./phi-2_Q8_0.gguf
Any idea how to create it?
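For what it's worth, the -f flag of ollama create expects a Modelfile, not the GGUF weights themselves, which is why the parser reports unknown commands and a missing FROM line. A minimal Modelfile sketch (assuming phi-2_Q8_0.gguf sits in the current directory):

```
FROM ./phi-2_Q8_0.gguf
```

Then point the command at that file instead: ollama create phi-2-q4 -f Modelfile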
Some modifications will be needed, as this isn't yet merged into llama.cpp: https://github.com/mrgraycode/llama.cpp/commit/12cc80cb8975aea3bc9f39d3c9b84f7001ab94c5#diff-150dc86746a90bad4fc2c3334aeb9b5887b3adad3cc1459446717638605348efR6239 but you can fork it.
Yeah, but it is not working in Ollama.
Support has been added to llama.cpp master, so the ball is in Ollama's court now.
https://github.com/ggerganov/llama.cpp/commit/b9e74f9bca5fdf7d0a22ed25e7a9626335fdfa48
LM Studio Beta is updated.
So I am trying there now!
Thanks!!
I find it interesting that a model of only 3B parameters will soon be able to run anywhere. It won't do math well; you would probably have to implement chain-of-thought in the prompts, or add external tools for post-processing.
@namankhator: thanks for the feedback! Please recall that this is a base completion model, so the format of your question really matters. When you give an instruction, I recommend using the format:
Instruct: YOUR INSTRUCTION
Output:
Moreover, for any kind of reasoning it's useful to add "Let's think step by step", even for easy questions. If you do both of those things, it works for your example.
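To make the two suggestions above concrete, here is a small sketch that assembles a phi-2 completion prompt in the Instruct/Output format with the "Let's think step by step" suffix appended. The function name is my own for illustration; the format itself is the one recommended above.

```python
def build_phi2_prompt(instruction: str) -> str:
    """Build a phi-2 base-model prompt in the Instruct/Output format,
    appending 'Let's think step by step.' to encourage reasoning."""
    return (
        f"Instruct: {instruction} Let's think step by step.\n"
        "Output:"
    )

prompt = build_phi2_prompt("What is 17 + 25?")
print(prompt)
```

The model is then expected to continue the text after "Output:", so the prompt deliberately ends there with no trailing newline.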
Hey @sebubeck
Thanks for the recommendations.
I believe Instruct and Output are already set. (attached image from LM Studio)
I tried the prompt you suggested, but it still did not work.
I will try tasks other than reasoning, and will update here if need be.