Adding a pad token with add_special_tokens raises an error; how should I solve it?
#19 by wangchengfei - opened
Here is my code section:

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen-7B", use_fast=False, trust_remote_code=True)
tokenizer.add_special_tokens({"pad_token": "[PAD]"})

But it produced the following error:

File "Qwen-7B/tokenization_qwen.py", line 165, in _add_tokens
    raise ValueError("Adding unknown special tokens is not supported")
ValueError: Adding unknown special tokens is not supported
We have a custom check in the tokenizer. Please try using <|extra_0|> and the like. Setting pad_token to <|endoftext|> should be generally okay for the pretrained models.
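For example, a minimal sketch of the suggested workaround (the attribute assignment follows the standard transformers tokenizer API; the thread itself only names the tokens to use):

from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "Qwen/Qwen-7B", use_fast=False, trust_remote_code=True
)

# Reuse a special token the tokenizer already knows instead of adding [PAD],
# which fails the custom check in tokenization_qwen.py.
tokenizer.pad_token = "<|endoftext|>"   # generally okay for the pretrained model
# or one of the reserved extra tokens:
# tokenizer.pad_token = "<|extra_0|>"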