Getting "Could not load model" errors when using the Inference API
#73 by breisa - opened
Why am I getting this error when using the Inference API? My code used to work, and I haven't changed anything.
500 Internal Server Error: {"error": "Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>)."}
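For context, the failing call looks roughly like the following. This is a minimal sketch assuming a standard requests-based call to the hosted Inference API endpoint; `YOUR_HF_TOKEN` is a placeholder, not a real credential:

```python
import requests

API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-cnn"
HEADERS = {"Authorization": "Bearer YOUR_HF_TOKEN"}  # placeholder token

def summarize(text):
    # POST the input text to the hosted Inference API.
    response = requests.post(API_URL, headers=HEADERS, json={"inputs": text})
    # A 500 here is a server-side failure (the model failed to load),
    # not a problem with the request payload itself.
    response.raise_for_status()
    return response.json()
```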
This is a 500 Internal Server Error, i.e. a failure on the server side rather than in your request. I just tried it now and it's working for me. Retry after some time.
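Since the failure is transient, retrying with a backoff often gets past it. A minimal sketch (the function name and retry parameters are illustrative, not part of any official client):

```python
import time
import requests

def query_with_retry(url, headers, payload, max_retries=5):
    # Retry transient 5xx failures with exponential backoff.
    for attempt in range(max_retries):
        response = requests.post(url, headers=headers, json=payload)
        if response.status_code < 500:
            response.raise_for_status()  # surface 4xx client errors immediately
            return response.json()
        time.sleep(2 ** (attempt + 1))  # wait 2, 4, 8, ... seconds between tries
    raise RuntimeError(
        f"Inference API still failing after {max_retries} retries: "
        f"{response.status_code} {response.text}"
    )
```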
Yeah, the issue still persists.
Confirmed, the issue still persists:
Could not load model facebook/bart-large-cnn with any of the following classes: (<class 'transformers.models.bart.modeling_bart.BartForConditionalGeneration'>, <class 'transformers.models.bart.modeling_tf_bart.TFBartForConditionalGeneration'>).
The above issue still persists.
Hi, does anyone know how to fix the "Could not load model" error?
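There is no client-side fix for the server error itself, but if the hosted endpoint stays down, one workaround is to run the model locally with transformers instead of the Inference API. A minimal sketch (the input text and generation parameters here are just examples):

```python
from transformers import pipeline

# Load facebook/bart-large-cnn locally instead of relying on the hosted API.
# This downloads the model weights on the first run.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

text = "The Hugging Face Inference API serves hosted models over HTTP, but a 500 error means the backend could not load the model."
result = summarizer(text, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

This trades the API's convenience for local compute, but it keeps working regardless of the hosted backend's state.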
+1
+1
+1
+1