This open-source model was created by Mistral AI.
Mistral AI announced it in a release blog post.
The model is available on the Hugging Face Hub: https://huggingface.co/mistralai/Mixtral-8x7B-Instruct-v0.1.
It is a sparse mixture-of-experts model with 46.7B total parameters, of which 12.9B are active per token, and it supports context lengths of up to 32K tokens.
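As a sketch of how the Instruct model is typically used, the snippet below builds a prompt in the `[INST] ... [/INST]` chat format documented on the model card, then shows (behind a flag, since the weights are roughly 90 GB) how one might load and run the model with the Hugging Face `transformers` library. The helper function `build_mixtral_prompt` and the generation parameters are illustrative assumptions, not part of the official API.

```python
RUN_GENERATION = False  # flip to True only if you have the hardware and weights

def build_mixtral_prompt(turns):
    """Format (user, assistant) turns into Mixtral's instruct prompt.

    Pass None as the assistant reply for the final turn so the model
    answers it. Illustrative helper; the tokenizer's chat template can
    do this for you as well.
    """
    prompt = "<s>"
    for user, assistant in turns:
        prompt += f"[INST] {user} [/INST]"
        if assistant is not None:
            prompt += f" {assistant}</s>"
    return prompt

prompt = build_mixtral_prompt([("What is the capital of France?", None)])

if RUN_GENERATION:
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # device_map="auto" spreads the 46.7B parameters across available GPUs
    model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```

In practice, `tokenizer.apply_chat_template` is the more robust route, since it keeps the prompt format in sync with the model card.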