Axolotl Config Settings
Hello! First of all, I would like to thank you for your work. If possible, could you share the Axolotl config settings? Keep up the good work.
Hi @teknium,
Following up on this. Can you please share the Axolotl config settings? Thanks!
at least the special tokens part :)
special_tokens:
  pad_token: <|end_of_text|>
  eos_token: <|im_end|>
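If you want to double-check locally, here is a minimal sketch (the checkpoint path is hypothetical) that loads the trained model's tokenizer and confirms it picked up those pad/eos settings:

from transformers import AutoTokenizer

# Hypothetical path to your fine-tuned checkpoint; substitute your own.
tokenizer = AutoTokenizer.from_pretrained("./my-finetuned-llama3")

print(tokenizer.pad_token)  # expected: <|end_of_text|>
print(tokenizer.eos_token)  # expected: <|im_end|>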
but no <|im_start|>?
@cinjonr I would say yes. I personally trained another Llama 3 model with the following:
special_tokens:
  eos_token: "<|im_end|>"
  pad_token: "<|end_of_text|>"
tokens:
  - "<|im_start|>"
If you pay for cloud GPU training services, I strongly suggest running a test first with a few rows and checking inference to be sure the model actually stops generation.
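As a rough example of that kind of smoke test (the model path and prompt are hypothetical, using the standard transformers generate API), something like this shows whether generation actually halts at <|im_end|>:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "./my-finetuned-llama3"  # hypothetical checkpoint path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# One ChatML-style prompt; check that the reply ends at <|im_end|>.
prompt = "<|im_start|>user\nSay hello in one sentence.<|im_end|>\n<|im_start|>assistant\n"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(
    **inputs,
    max_new_tokens=64,
    eos_token_id=tokenizer.convert_tokens_to_ids("<|im_end|>"),
)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:]))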
but no <|im_start|>?
No, there is no <|im_start|> token as a special token. It is an added token though; you can use this as a reference:
https://huggingface.co/NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer
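A small sketch of how to see that distinction in the linked tokenizer (the expected outputs are my assumption, not verified here): special_tokens_map should list only eos/pad, while <|im_start|> shows up in the added vocab.

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NousResearch/Meta-Llama-3-8B-Alternate-Tokenizer")

print(tokenizer.special_tokens_map)                   # should show eos_token / pad_token entries
print("<|im_start|>" in tokenizer.get_added_vocab())  # should be True: added token, not a special token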