
Question regarding max_position_embeddings in config.json

#1
by tarun360 - opened

Hello,

For allenai/OLMo-7B-0724-Instruct-hf, the model card mentions that the context length is 4096. Hence, max_position_embeddings in config.json should be set to 4096, right? Currently, it's set to 2048.

For allenai/OLMo-7B-0724-hf, it's correctly set to 4096 here: https://huggingface.co/allenai/OLMo-7B-0724-hf/blob/main/config.json
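For reference, the relevant field is a top-level key in config.json. A minimal sketch of what the corrected entry would look like (other keys omitted; the surrounding values here are illustrative, not copied from the actual file):

```json
{
  "model_type": "olmo",
  "max_position_embeddings": 4096
}
```

Since libraries like transformers read this value to size the position embeddings and to cap generation length, a config.json value of 2048 would silently limit the usable context even though the model was trained for 4096.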

Good check! Fixed.

natolambert changed discussion status to closed
