Why isn't the `model_max_length` set to 2048?

#32
by alvarobartt HF staff - opened
Hugging Face H4 org
•
edited Nov 28, 2023

Hi here 🤗

Asking out of my own possible misunderstanding: why is `model_max_length` within the `tokenizer_config.json` set to 1000000000000000019884624838656? Shouldn't it be 2048, as per the Zephyr paper? Does this have any unintended side effects? Is there any rationale behind it?

See https://huggingface.co/HuggingFaceH4/zephyr-7b-beta/blob/main/tokenizer_config.json#L38
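For what it's worth, that number appears to be `int(1e30)`, the sentinel `transformers` uses when no explicit limit is stored in the config (effectively "no limit"). A minimal sketch below verifies the value and shows one way to enforce a limit yourself by passing `model_max_length` to `from_pretrained` (the loading call is commented out since it needs the library and network access; treat it as an assumption about the workaround, not a statement of why the config is set this way):

```python
# The value in tokenizer_config.json matches int(1e30), which transformers
# uses as a "no limit set" sentinel for model_max_length.
sentinel = int(1e30)
print(sentinel)  # 1000000000000000019884624838656
assert sentinel == 1000000000000000019884624838656

# Possible workaround (assumption): override the limit at load time.
# Requires `pip install transformers` and network access, so it is
# commented out here.
# from transformers import AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained(
#     "HuggingFaceH4/zephyr-7b-beta",
#     model_max_length=2048,  # the context length reported in the Zephyr paper
# )
```

With the override, tokenizer calls that use `truncation=True` would cap inputs at 2048 tokens instead of effectively never truncating.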

Aha, I have the same question.
