MoritzLaurer (HF staff) committed
Commit: 4af99d9
Parent: 0f81ba4

the model is supported by TGI


I've just deployed a TGI-backed HF Inference Endpoint and it worked, so the model appears to be supported by TGI. Support for Mixtral is also mentioned in the [TGI docs](https://huggingface.co/docs/text-generation-inference/supported_models).
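
For reference, a minimal sketch of the kind of check described above, using `huggingface_hub.InferenceClient` to send a generation request to a TGI-backed endpoint. The endpoint URL and token are placeholders, not details from this commit.

```python
from huggingface_hub import InferenceClient

# Point the client at the deployed TGI endpoint
# (placeholder URL and token; substitute your own deployment).
client = InferenceClient(
    model="https://YOUR-ENDPOINT.endpoints.huggingface.cloud",
    token="hf_...",
)

# TGI serves the text-generation route; a short prompt is enough
# to confirm the endpoint is up and returning completions.
output = client.text_generation(
    "Explain what Text Generation Inference (TGI) is in one sentence.",
    max_new_tokens=64,
)
print(output)
```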

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -242,7 +242,7 @@ It is strongly recommended to use the text-generation-webui one-click-installers
  <!-- README_GPTQ.md-use-from-tgi start -->
  ## Serving this model from Text Generation Inference (TGI)
 
- Not currently supported for Mixtral models.
+ The model is supported by [TGI](https://huggingface.co/docs/text-generation-inference/supported_models)
 
  <!-- README_GPTQ.md-use-from-tgi end -->
  <!-- README_GPTQ.md-use-from-python start -->