<p>
This open-source model was created by <a href="https://mistral.ai/">Mistral AI</a>.
You can find the release blog post <a href="https://mistral.ai/news/mixtral-of-experts/">here</a>.
The model is available on the Hugging Face Hub: <a href="https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1">https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1</a>.
The model has 46.7B total parameters, of which 12.9B are active per token, and supports a context window of up to 32K tokens.
</p>
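<p>
For reference, below is a minimal sketch of loading the model from the Hub link above with the Hugging Face <code>transformers</code> library. It assumes a recent <code>transformers</code> version with Mixtral support (4.36 or later) and enough GPU/CPU memory for the full 46.7B weights; only 12.9B parameters are active per token, but all weights must still be resident.
</p>
<pre><code>
# Sketch: load Mixtral-8x7B-Instruct via transformers (assumes >= 4.36).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Although only ~12.9B of the 46.7B parameters are active per token,
# the full weights are loaded; device_map="auto" shards them across
# available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain mixture-of-experts briefly."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
</code></pre>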