Error from tokenizer config

#26
by SowmithR - opened

I am getting an error while loading the model. I didn't know where to raise the bug, so I am writing it here.

```
>>> tokenizer = AutoTokenizer.from_pretrained('NousResearch/Hermes-2-Pro-Llama-3-8B')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/homebrew/anaconda3/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 834, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/lib/python3.11/site-packages/transformers/models/auto/tokenization_auto.py", line 688, in get_tokenizer_config
    result = json.load(reader)
             ^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/lib/python3.11/json/__init__.py", line 293, in load
    return loads(fp.read(),
           ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/anaconda3/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 2061 column 5 (char 55787)
```

Latest code changes on this 2 hours ago: https://huggingface.co/NousResearch/Hermes-2-Pro-Llama-3-8B/commit/e52178d17276cb3738f158b2ec6d6a8a0140bf7d

@interstellarninja can you please take a look?
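A quick way to confirm that the file itself is malformed (rather than the transformers install) is to run the downloaded `tokenizer_config.json` through the standard-library JSON parser directly. A minimal sketch; the `check_json` helper and `broken.json` file are hypothetical, not part of transformers:

```python
import json

def check_json(path):
    """Try to parse a JSON file; return the decode error, or None if valid."""
    with open(path, encoding="utf-8") as f:
        try:
            json.load(f)
        except json.JSONDecodeError as e:
            return e  # carries .msg, .lineno, .colno for locating the problem
    return None

# Demonstration with a deliberately broken file:
with open("broken.json", "w", encoding="utf-8") as f:
    f.write('{"a": 1\n"b": 2}')  # missing comma between the two entries

err = check_json("broken.json")
if err is not None:
    print(f"{err.msg} at line {err.lineno}, column {err.colno}")
```

The `lineno`/`colno` attributes point at the same "line 2061 column 5" location the traceback reports, which makes it easy to inspect the offending spot in an editor.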

I can confirm that this matches my experience with this model revision as well.

Same error here.

The last `}` in line 2061 needs to be deleted.

@interstellarninja could you please make the change proposed by ningpengtao? I also think this is the source of the current error.
I would simply remove the `}` at the end of line 2060, rather than 2061, to remain consistent with the rest of the file.
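For reference, a stray closing brace inside a JSON array reproduces exactly this decoder message: after an array element, the parser accepts only `,` or `]`, so an extra `}` triggers "Expecting ',' delimiter". A small illustration; the string below is made up for demonstration, not the actual `tokenizer_config.json` contents:

```python
import json

# A stray "}" after an array element, similar to the suspected typo:
broken = '{"chat_template": [{"name": "default"}}]}'

try:
    json.loads(broken)
except json.JSONDecodeError as e:
    # The parser was scanning the array and saw "}" where "," or "]" belongs.
    print(f"{e.msg} (line {e.lineno}, column {e.colno})")
```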

NousResearch org
interstellarninja changed discussion status to closed
