Error converting mistralai/Mistral-7B-Instruct-v0.1

#33
by eastriver - opened

Conversion Settings:

        Model: mistralai/Mistral-7B-Instruct-v0.1
        Task: None
        Compute Units: None
        Precision: None
        Tolerance: None
        Push to: None

        Error: "mistral is not supported yet. Only ['bart', 'beit', 'bert', 'big_bird', 'bigbird_pegasus', 'blenderbot', 'blenderbot_small', 'bloom', 'convnext', 'ctrl', 'cvt', 'data2vec', 'distilbert', 'ernie', 'gpt2', 'gpt_bigcode', 'gptj', 'gpt_neo', 'gpt_neox', 'levit', 'llama', 'm2m_100', 'marian', 'mobilebert', 'mobilevit', 'mvp', 'pegasus', 'plbart', 'roberta', 'roformer', 'segformer', 'splinter', 'squeezebert', 't5', 'vit', 'yolos'] are supported. If you want to support mistral please propose a PR or open up an issue."
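For context, the error above comes from a whitelist check: the converter compares the checkpoint's `model_type` against the list of architectures it knows how to export, and rejects anything else. A minimal sketch of that kind of check (function and variable names are illustrative, not the actual exporters source):

```python
# Illustrative sketch of the architecture whitelist check that produces
# the error above. The list here is abbreviated; the real converter
# supports the full set quoted in the error message.
SUPPORTED = ["bart", "bert", "gpt2", "llama", "t5"]

def check_supported(model_type: str, supported: list[str]) -> None:
    """Raise ValueError when the architecture is not in the whitelist."""
    if model_type not in supported:
        raise ValueError(
            f"{model_type} is not supported yet. Only {supported} are "
            f"supported. If you want to support {model_type} please "
            f"propose a PR or open up an issue."
        )

check_supported("llama", SUPPORTED)      # passes silently
# check_supported("mistral", SUPPORTED)  # raises ValueError, as in the report
```

Adding a new architecture therefore means teaching the exporter about it (as the PR linked below does), not just retrying the conversion.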
        
Core ML Projects org

Hi @eastriver ! Mistral support is being added to exporters here: https://github.com/huggingface/exporters/pull/62

I'll let you know when the converter supports it :)

Core ML Projects org

I just updated this conversion Space to support Mistral. Please let us know if it works for you, @eastriver :)

“mistral is not supported yet. Only [‘bart’, ‘beit’, ‘bert’, ‘big_bird’, ‘bigbird_pegasus’, ‘blenderbot’, ‘blenderbot_small’, ‘bloom’, ‘convnext’, ‘ctrl’, ‘cvt’, ‘data2vec’, ‘distilbert’, ‘ernie’, ‘gpt2’, ‘gpt_bigcode’, ‘gptj’, ‘gpt_neo’, ‘gpt_neox’, ‘levit’, ‘llama’, ‘m2m_100’, ‘marian’, ‘mobilebert’, ‘mobilevit’, ‘mvp’, ‘pegasus’, ‘plbart’, ‘roberta’, ‘roformer’, ‘segformer’, ‘splinter’, ‘squeezebert’, ‘t5’, ‘vit’, ‘yolos’] are supported. If you want to support mistral please propose a PR or open up an issue.”

Hi! Thank you for your work, @pcuenq .
Unfortunately, conversion through https://huggingface.co/spaces/coreml-projects/transformers-to-coreml still doesn't work.

I checked:
mistralai/Mistral-7B-v0.1
mistralai/Mistral-7B-Instruct-v0.1
HuggingFaceH4/zephyr-7b-alpha
HuggingFaceH4/zephyr-7b-beta

Core ML Projects org

Hello @eastriver ! My bad, I didn't update the code properly, sorry about that. It's fixed now, so Mistral and Zephyr models are recognized. However, a test conversion of Zephyr failed at the end of the process ("Error in declaring network."), an error that does not occur when I convert locally on my machine. I'm investigating what the cause could be.
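For anyone who wants to try the local route in the meantime, the exporters repository documents a CLI entry point along these lines (the output directory name is illustrative, and defaults such as task and precision may differ from what the Space uses):

```shell
# Sketch of a local Core ML conversion with huggingface/exporters,
# assuming the package is installed, e.g.:
#   pip install git+https://github.com/huggingface/exporters
python -m exporters.coreml --model=mistralai/Mistral-7B-Instruct-v0.1 exported/
```

Note that a 7B model needs substantial RAM to convert, so results on your machine may vary.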
