How to retrain the model with a larger max length?

#46
by lucasjin - opened

How can I retrain the model with a larger max length?

Yes, please! The output token limit is too small to make real use of the model.

Yes, you're right. I'm looking for a way to do it; any thoughts?

Gradient AI was able to extend Llama-3 from an 8k context length to a 1048k-context model: https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k
They used the YaRN method to increase the context length: https://arxiv.org/abs/2309.00071
Maybe something similar can be done here?
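
For reference, here is a minimal sketch of what YaRN-style extension looks like through the `rope_scaling` config in transformers, assuming a Llama-family decoder and a recent transformers version that supports the `"yarn"` rope type (the exact config keys vary by version, and whether this carries over to the decoder used in this repo is an open question):

```python
from transformers import AutoConfig, AutoModelForCausalLM

# Sketch: stretch an 8k RoPE context to 32k with YaRN scaling.
config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
config.rope_scaling = {
    "rope_type": "yarn",   # newer transformers versions use "rope_type", older ones "type"
    "factor": 4.0,         # 8192 * 4 = 32768
    "original_max_position_embeddings": 8192,
}
config.max_position_embeddings = 32768

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-8B-Instruct",
    config=config,
)
# Note: Gradient AI also fine-tuned on long sequences; the rope
# scaling alone only stretches the positional encoding.
```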

Actually, we can directly change the label length to 2048 or 4096, which is enough for any dense OCR.
But the question is:

when I change the label length to 2048, it crashes when computing the loss.

I don't know why it crashes; theoretically it shouldn't. Just as we can expand an LLM's output length, it shouldn't be limited like this.
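
If it helps with debugging: one likely cause of the crash is a learned position-embedding table sized at 1024, so any position index >= 1024 is out of range for the embedding lookup (an index error rather than a loss problem as such). Below is a hedged sketch of growing such a table before fine-tuning; the attribute path `model.decoder.embed_positions` is hypothetical, so adapt it to this model's actual module names:

```python
import torch

def grow_position_embeddings(embed: torch.nn.Embedding, new_len: int) -> torch.nn.Embedding:
    """Grow a learned position-embedding table from its current size to new_len."""
    old_len, dim = embed.weight.shape
    new_embed = torch.nn.Embedding(new_len, dim)
    with torch.no_grad():
        # Keep the trained rows as-is.
        new_embed.weight[:old_len] = embed.weight
        # Fill the new rows by linearly interpolating the old table
        # to the new length (a simple heuristic, not the only option).
        stretched = torch.nn.functional.interpolate(
            embed.weight.T.unsqueeze(0), size=new_len,
            mode="linear", align_corners=False,
        ).squeeze(0).T
        new_embed.weight[old_len:] = stretched[old_len:]
    return new_embed

# Hypothetical usage; the module path depends on the model:
# model.decoder.embed_positions = grow_position_embeddings(
#     model.decoder.embed_positions, 2048)
```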

Hi, I changed the max_length to 2048 and it works. You probably need to change every 'max'-related setting and 'num_pos' from 1024 to 2048 in every file of this model, and add 'ignore_mismatched_sizes=True' to the 'from_pretrained' method if you want to load an existing checkpoint.
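
Putting that recipe into code, a minimal sketch might look like the following; `path/to/model` is a placeholder and the `max_length` config key is illustrative (use whatever fields this repo's config actually defines):

```python
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("path/to/model", trust_remote_code=True)
config.max_length = 2048  # was 1024; also update num_pos etc. in the modeling files

# ignore_mismatched_sizes lets from_pretrained skip checkpoint tensors
# (e.g. position embeddings) whose shapes no longer match the config;
# those tensors are freshly initialized, so fine-tune before use.
model = AutoModel.from_pretrained(
    "path/to/model",
    config=config,
    ignore_mismatched_sizes=True,
    trust_remote_code=True,
)
```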
