I could not install xformers on my M3.

#10 · opened by keerthi1306logesh
Alibaba-NLP org

Hi @keerthi1306logesh, you can use eager attention on M3 (Apple Silicon) machines; xformers is not required in that case.

Alibaba-NLP org

Load the model in this way:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained('YOUR_MODEL')
model = AutoModel.from_pretrained('YOUR_MODEL', attn_implementation='eager', trust_remote_code=True)
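
Once loaded, you can run it on the Apple GPU via the MPS backend. A minimal continuation sketch, assuming a recent PyTorch build with MPS support and that the model returns a standard last_hidden_state; the device selection and example sentence are illustrative, not part of the original reply:

import torch

# Move to the Apple MPS backend if available; eager attention runs fine on MPS.
device = 'mps' if torch.backends.mps.is_available() else 'cpu'
model = model.to(device).eval()

# Example forward pass; the sentence is just a placeholder.
inputs = tokenizer('Hello from an M3 machine', return_tensors='pt').to(device)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, seq_len, hidden_size])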
keerthi1306logesh changed discussion status to closed
