In the config, `max_position_embeddings` is 32768, but I read somewhere that the model was trained on an 8k context length.
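For reference, the value can be checked directly in the checkpoint's `config.json`. A minimal sketch (the inline JSON here is a hypothetical stand-in for the real file shipped with the model):

```python
import json

# Hypothetical excerpt of a checkpoint's config.json; in practice, read the
# actual file from the model directory instead.
config_text = '{"max_position_embeddings": 32768, "model_type": "llama"}'

config = json.loads(config_text)

# max_position_embeddings is the size of the position range the model *supports*;
# it can be larger than the sequence length actually used during training.
print(config["max_position_embeddings"])  # → 32768
```

Note that `max_position_embeddings` only bounds the positions the architecture can represent, so a 32768 value is compatible with training mostly on shorter (e.g. 8k) sequences.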