Why are max_length and max_position_embeddings inconsistent? AceGPT is based on Llama-2, so why isn't it 4096?
AceGPT was trained with max_length=2048. The max_position_embeddings value in the config records the architectural position limit inherited from Llama-2 (4096), while max_length reflects the sequence length actually used during training, so the two can legitimately differ.
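As a minimal sketch of how to check both values yourself (the repo id below is illustrative; substitute the actual AceGPT checkpoint you are using):

```python
from transformers import AutoConfig, AutoTokenizer

model_id = "FreedomIntelligence/AceGPT-7B"  # hypothetical repo id

# Architectural limit inherited from the Llama-2 base model.
config = AutoConfig.from_pretrained(model_id)
print(config.max_position_embeddings)

# The tokenizer's model_max_length may instead reflect the
# 2048-token length the model was fine-tuned with.
tokenizer = AutoTokenizer.from_pretrained(model_id)
print(tokenizer.model_max_length)
```

In practice, capping generation at the trained length (2048) is the safer choice, since quality beyond the fine-tuning context is not guaranteed even if the position embeddings allow it.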