neural-chat-mini-v2.2-1.8B / added_tokens.json
Upload tokenizer (commit e4ac2b1, verified)
{
  "<|endoftext|>": 151643,
  "<|im_end|>": 151645,
  "<|im_start|>": 151644
}
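
The file maps the tokenizer's added special tokens to their vocabulary IDs. As a minimal sketch (assuming the repo id is "Locutusque/neural-chat-mini-v2.2-1.8B" and a standard Hugging Face transformers setup), the mapping can be checked like this:

from transformers import AutoTokenizer

# Repo id inferred from the file path above; adjust if the model lives elsewhere.
tokenizer = AutoTokenizer.from_pretrained("Locutusque/neural-chat-mini-v2.2-1.8B")

# Each added token should resolve to the ID listed in added_tokens.json:
# <|endoftext|> -> 151643, <|im_start|> -> 151644, <|im_end|> -> 151645
for token in ("<|endoftext|>", "<|im_start|>", "<|im_end|>"):
    print(token, tokenizer.convert_tokens_to_ids(token))

The <|im_start|> and <|im_end|> tokens are the ChatML turn delimiters, and <|endoftext|> is the end-of-text token; all three sit at the top of the vocabulary range, consistent with tokens appended after the base vocabulary.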