llama-2-70B-chat-hf parameters
#221 by delewis - opened
Hi,
Just curious: what hyperparameters are being used for the model that's running the chat-ui? I tried looking at the chat-ui Space repository, but it looks like everything is configured via an .env.local specific to llama-2-70b-chat-hf.
Thanks
Hey! Here are our parameters for llama-2-70B-chat:
"userMessageToken": "[INST]",
"assistantMessageToken": "[/INST]",
"messageEndToken": "</s>",
"preprompt": "[INST]<<SYS>>\nYou are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe. Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature.\n\nIf a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.<</SYS>>\n\n[/INST]",
"parameters": {
"temperature": 0.6,
"top_p": 0.95,
"repetition_penalty": 1.2,
"top_k": 50,
"truncate": 1000,
"max_new_tokens": 1024
},
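For anyone who wants to try these settings outside chat-ui, here's a minimal sketch that passes the same sampling parameters to the hosted model via huggingface_hub's InferenceClient. This is not chat-ui's actual code; the preprompt is abbreviated and the user message is made up for illustration.

```python
# Minimal sketch: call the hosted model with the parameters listed above.
from huggingface_hub import InferenceClient

client = InferenceClient(model="meta-llama/Llama-2-70b-chat-hf")

# Build the prompt the way the config above implies: preprompt first,
# then userMessageToken + message + assistantMessageToken.
preprompt = (
    "[INST]<<SYS>>\nYou are a helpful, respectful and honest assistant."
    "\n<</SYS>>\n\n[/INST]"
)
prompt = preprompt + "[INST]What is the capital of France?[/INST]"

output = client.text_generation(
    prompt,
    do_sample=True,        # enable sampling so temperature/top_p take effect
    temperature=0.6,
    top_p=0.95,
    repetition_penalty=1.2,
    top_k=50,
    truncate=1000,         # clip the input to the last 1000 tokens
    max_new_tokens=1024,
)
print(output)
```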
I hope that answers your question!
Yes, this is exactly what I was looking for. Thank you!
delewis changed discussion status to closed
Is it [INST]<<SYS>> or <<SYS>>[INST], and are newlines relevant? According to the authors, yes.
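For reference, Meta's generation code puts [INST] first and nests the <<SYS>> block inside it, with exact newlines. A minimal sketch of that layout (the constant names mirror the llama repo):

```python
# Llama 2 chat prompt layout per Meta's reference implementation:
# [INST] <<SYS>>\n{system}\n<</SYS>>\n\n{user} [/INST]
B_INST, E_INST = "[INST]", "[/INST]"
B_SYS, E_SYS = "<<SYS>>\n", "\n<</SYS>>\n\n"

def first_turn(system_prompt: str, user_msg: str) -> str:
    # The system block rides inside the first [INST], newlines included,
    # with a blank line between <</SYS>> and the user message.
    return f"{B_INST} {B_SYS}{system_prompt}{E_SYS}{user_msg} {E_INST}"

print(first_turn("You are a helpful assistant.", "Hello!"))
```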