How to increase or decrease the context length?

#9
by CouchCommander - opened

I am trying to run my model with a dynamic context length, but it is not specified how to configure it.

Google org
edited Sep 17

Hi @CouchCommander, you can adjust the length of the generated output using `max_length` or `max_new_tokens` when calling generate, e.g. `outputs = model.generate(**input_ids, max_length=100)`, or by setting it in the generation config when instantiating the model. Note that `max_length` counts the prompt plus the generated tokens, while `max_new_tokens` counts only the newly generated tokens.
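A minimal sketch of the above, assuming a causal LM loaded through the Transformers library; the model id here is a small placeholder for testing, not the model from this thread:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder tiny model for illustration; substitute your own checkpoint.
model_id = "sshleifer/tiny-gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Write a short poem about the sea.", return_tensors="pt")

# max_new_tokens bounds only the continuation;
# max_length would bound prompt + continuation combined.
outputs = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

The same limits can also be set once via the model's `generation_config` so every `generate` call inherits them.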
