Context/Input length of the Model
#1 · by kreabs · opened
What is the maximum context length for the model's attention mechanism?
Are there any plans on your side to publish models with context_len >= 100k tokens?
Thanks for the great work on LLMs for the German-speaking community. It's important to further develop NLP for German, a language with almost 100 million native speakers!
AND: Darius Hennekeuser, football god!!
Hey Tim!
We are indeed planning to release models with large context windows. However, it's not our first priority at the moment, as we already have some other exciting models in the pipeline to be released soon. So large-context models will be released in the next few months rather than the next few weeks :-)
Thank you for the kind words, and sporting regards :-D
- The context length is 8192 tokens.
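For reference, a quick way to verify this yourself is to read the model's configuration with the `transformers` library. A minimal sketch, assuming a Hugging Face-hosted checkpoint; the repository id below is a placeholder, not the actual model name from this thread:

```python
from transformers import AutoConfig

# Placeholder repository id; substitute the actual model repo.
config = AutoConfig.from_pretrained("your-org/your-german-llm")

# Most causal LMs expose the configured context window as
# max_position_embeddings (expected to be 8192 for this model).
print(config.max_position_embeddings)
```

Note that `max_position_embeddings` reflects the context window the model was configured with; some models can be extended beyond it at inference time with techniques such as RoPE scaling.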
DavidGF changed discussion status to closed