weights in llama format
Can you provide the weights in llama format? Or maybe a script to do the conversion ourselves?
Hi, the Orion-14B architecture is slightly different from Llama's, so we cannot provide the weights in llama format.
In fact, you can just download the Orion-14B model you need (for example `/path/Orion-14B-Chat`) and change the model-loading code like below, from:

```python
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-chat-hf", **kwargs)
```

to:

```python
model = AutoModelForCausalLM.from_pretrained("/path/Orion-14B-Chat", trust_remote_code=True, **kwargs)
```
Just have a try and enjoy!
I want to use `LlamaForCausalLM` to load the model instead of `OrionForCausalLM`, which requires the additional step of passing `trust_remote_code=True`. I also plan to run the model with vLLM; by converting the model weights into the `LlamaForCausalLM` format, I can make it work smoothly with vLLM.
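For what it's worth, vLLM also accepts a `--trust-remote-code` flag, and more recent vLLM releases list `OrionForCausalLM` among their supported architectures, so a llama-format conversion may not be strictly necessary; check the supported-models page for your vLLM version. A minimal sketch of serving the original checkpoint directly (the model path is a placeholder):

```shell
# Serve the original Orion checkpoint without converting the weights.
# Assumes a vLLM build whose supported-models list includes OrionForCausalLM.
python -m vllm.entrypoints.openai.api_server \
    --model /path/Orion-14B-Chat \
    --trust-remote-code
```

If your vLLM version predates Orion support, this will fail with an unsupported-architecture error, in which case converting the weights (as discussed above) remains the only route.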