tokenizer.chat_template
#2 · opened by leonardlin
If anyone wants to use HF's new chat templates, here's a template whose formatting exactly matches the documented output format:
tokenizer.chat_template = "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{% if message['role'] == 'system' %}{{ message['content'] + '\n\n' }}{% elif message['role'] == 'user' %}{{'### 指瀺:\n' + message['content'] + '\n\n'}}{% elif message['role'] == 'assistant' %}{{'### 応答:\n' + message['content'] + '\n\n'}}{% endif %}{% endfor %}{% if add_generation_prompt %}{{ '### 応答:' }}{% endif %}"
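Here's a minimal sketch of setting that template on a loaded tokenizer and rendering a conversation with apply_chat_template as a sanity check (the model ID is just a placeholder for whichever Swallow instruct checkpoint you're loading, and the messages are made-up examples):

```python
from transformers import AutoTokenizer

model_id = "tokyotech-llm/Swallow-7b-instruct-hf"  # placeholder: use your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Same Jinja template as above, split across lines for readability.
tokenizer.chat_template = (
    "{% if not add_generation_prompt is defined %}"
    "{% set add_generation_prompt = false %}{% endif %}"
    "{% for message in messages %}"
    "{% if message['role'] == 'system' %}{{ message['content'] + '\n\n' }}"
    "{% elif message['role'] == 'user' %}{{'### 指瀺:\n' + message['content'] + '\n\n'}}"
    "{% elif message['role'] == 'assistant' %}{{'### 応答:\n' + message['content'] + '\n\n'}}"
    "{% endif %}{% endfor %}"
    "{% if add_generation_prompt %}{{ '### 応答:' }}{% endif %}"
)

chat = [
    {"role": "system", "content": "仄䞋に、あるタスクを説明する指瀺がありたす。リク゚ストを適切に完了するための回答を蚘述しおください。"},
    {"role": "user", "content": "富士山の暙高を教えおください。"},
]

# add_generation_prompt=True appends the trailing '### 応答:' header for the model to continue from.
print(tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True))
```

That prints the system prompt, then '### 指瀺:' with the user content on the next line, then the open '### 応答:' header, each block separated by a blank line, matching the documented format.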
The roles are system, user, and assistant, and you could start with the suggested prompt:
PROMPT = "仄䞋に、あるタスクを説明する指瀺がありたす。リク゚ストを適切に完了するための回答を蚘述しおください。"
# "Below is an instruction that describes a task. Write a response that appropriately completes the request."
chat = []
chat.append({"role": "system", "content": PROMPT})
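From there, a rough end-to-end sketch (again, the model ID, the user question, and the generation settings are only illustrative, and it assumes tokenizer.chat_template has been set as above):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tokyotech-llm/Swallow-7b-instruct-hf"  # placeholder: use your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
# set tokenizer.chat_template to the Jinja string above before rendering
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

PROMPT = "仄䞋に、あるタスクを説明する指瀺がありたす。リク゚ストを適切に完了するための回答を蚘述しおください。"
chat = [
    {"role": "system", "content": PROMPT},
    {"role": "user", "content": "富士山に぀いお教えおください。"},  # illustrative user turn
]

# Render the prompt (ending with the open '### 応答:' header), tokenize, and generate.
prompt = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```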
For those looking for MT-Bench formatting, I also made a version that's close (not sure whether the ADD_COLON_SINGLE separator style adds the appropriate \n or not): https://github.com/AUGMXNT/shisa/wiki/Evals-:-JA-MT%E2%80%90Bench#swallow
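If I'm reading FastChat's ADD_COLON_SINGLE handling right, it joins each turn as role + ": " + message + sep, so the content ends up on the same line as '### 指瀺:' rather than on the next line as in the official format. A rough standalone sketch of that join logic for comparison (this is not FastChat's actual code, and the role names and separator are my guesses at the registered template):

```python
# Rough reimplementation of an ADD_COLON_SINGLE-style join, for comparing against
# the official '### 指瀺:\n...' format. Roles and separator are assumptions.
def add_colon_single(system, turns, sep="\n\n"):
    out = system + sep
    for role, message in turns:
        if message is not None:
            out += role + ": " + message + sep
        else:
            out += role + ":"  # open turn for the model to complete
    return out


print(add_colon_single(
    "仄䞋に、あるタスクを説明する指瀺がありたす。リク゚ストを適切に完了するための回答を蚘述しおください。",
    [("### 指瀺", "富士山に぀いお教えおください。"), ("### 応答", None)],
))
# Prints '### 指瀺: 富士山に぀いお...' (space after the colon) instead of
# '### 指瀺:' followed by the content on the next line.
```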