`config_class` is not matched when loading HAT (I3) with AutoModelForSequenceClassification

#1
by Siki-77 - opened

Hi, I hit an error when running your example with `AutoModelForSequenceClassification`:

```python
tokenizer = AutoTokenizer.from_pretrained("kiddothe2b/hierarchical-transformer-I3-mini-1024", trust_remote_code=True)
model = AutoModelForSequenceClassification.from_pretrained("kiddothe2b/hierarchical-transformer-I3-mini-1024", trust_remote_code=True).to(device)
```

This raises the following error. Could you kindly give me some advice on how to fix it?

```
ValueError: The model class you are passing has a config_class attribute that is not consistent with the config class you passed (model has <class 'transformers_modules.kiddothe2b.hierarchical-transformer-I3-mini-1024.00a6645482b7a7e6ffe3362ef289cf1e90702634.modelling_hat.HATConfig'> and you passed <class 'transformers_modules.kiddothe2b.hierarchical-transformer-I3-mini-1024.00a6645482b7a7e6ffe3362ef289cf1e90702634.configuration_hat.HATConfig'>. Fix one of those so they match!
```
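If I read the traceback correctly, `HATConfig` seems to be defined in both `modelling_hat` and `configuration_hat`, and the check compares the two config classes by identity, so two same-named classes from different modules can never match. A minimal standalone sketch of that kind of identity check (the class names below are hypothetical stand-ins, not the real HAT code):

```python
# Hypothetical stand-ins for the two same-named config classes;
# not the actual HAT implementation.
class ConfigFromConfigurationHat:
    """Plays the role of configuration_hat.HATConfig."""

class ConfigFromModellingHat:
    """Plays the role of modelling_hat.HATConfig."""

passed_config = ConfigFromConfigurationHat()
model_config_class = ConfigFromModellingHat

# Identity check in the style of the error above: even if both classes
# were literally named HATConfig, the class *objects* differ, so the
# check fails and a ValueError like the one I pasted would be raised.
configs_match = type(passed_config) is model_config_class
print(configs_match)  # False
```

Is that the right reading of the error, and if so, which of the two definitions should be kept?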
