<p>
This open-source model was created by <a href="https://qwenlm.github.io/">the Qwen Team of Alibaba Cloud</a>.
You can find the release blog post <a href="https://qwenlm.github.io/blog/qwen2.5/">here</a>.
The model is available on the Hugging Face Hub: <a href="https://huggingface.co/Qwen/Qwen2.5-7B-Instruct">https://huggingface.co/Qwen/Qwen2.5-7B-Instruct</a>.
The 7B model was pretrained on 18 trillion tokens spanning 29 languages.
It supports a context length of up to 128K tokens and can generate up to 8K tokens of output.
</p>
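Since the model is hosted on the Hugging Face Hub, it can be loaded with the standard <code>transformers</code> chat workflow. The sketch below is a minimal example, assuming <code>transformers</code>, <code>torch</code>, and <code>accelerate</code> are installed and that enough memory is available for the 7B weights; the prompt text is illustrative.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-7B-Instruct"

# Download the tokenizer and weights from the Hugging Face Hub.
# device_map="auto" (requires `accelerate`) places layers on available devices.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype="auto", device_map="auto"
)

# Build a chat prompt using the model's own chat template.
messages = [
    {"role": "user", "content": "Give me a short introduction to large language models."}
]
text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
inputs = tokenizer([text], return_tensors="pt").to(model.device)

# Generate a reply; max_new_tokens is an illustrative value well under
# the model's 8K generation limit.
outputs = model.generate(**inputs, max_new_tokens=256)
response = tokenizer.decode(
    outputs[0][inputs.input_ids.shape[1]:], skip_special_tokens=True
)
print(response)
```

The slice on <code>outputs[0]</code> drops the echoed prompt tokens so only the newly generated reply is decoded.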