"Model type not found" error

#18
by tript - opened

@wanghaofan Inference is not working; I'm receiving the error:
"Model type not found"

I'm also getting the error when deploying to Spaces: when the container starts running, there's an error in the input stream and the logs cannot be accessed.

For the past few hours, almost all Spaces on HF have been buildable but not working. I don't know whether the problem is with this model or not.
https://status.huggingface.co/
https://discuss.huggingface.co/t/504-gateway-time-out/107971/

@John6666 It's still not working.

I looked into it. There are several problems with the setup of this repo.

So, is there any workaround to run inference?

Unless the repo author or HF fixes it, the only option is a paid Inference Endpoint.
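For context, a "model type not found" style error usually means the Auto classes cannot infer the architecture from the repo's `config.json`, typically because the `model_type` key is missing or not recognized by the installed library version. A minimal sketch of that check, with a hypothetical `check_model_type` helper (the function name and `known_types` set are illustrations, not part of any library):

```python
def check_model_type(config: dict, known_types: set) -> str:
    """Return the model type from a config dict, mimicking the check
    that Auto classes perform when resolving an architecture.

    Hypothetical helper for illustration only.
    """
    model_type = config.get("model_type")
    if model_type is None:
        # Without this key, an Auto class has no way to pick a model class.
        raise ValueError("config.json has no 'model_type' key")
    if model_type not in known_types:
        # An unrecognized value often means the installed library is too old,
        # or the repo targets a different library than the one loading it.
        raise ValueError(f"unrecognized model_type: {model_type!r}")
    return model_type


# Example: a config with a recognized type passes the check.
config = {"model_type": "llama"}
print(check_model_type(config, known_types={"llama", "bert"}))
```

If the repo's `config.json` really lacks the key (or the Space pins an older library that doesn't know the type), only the repo author or a library upgrade can fix it; that is consistent with the reply above.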

@wanghaofan @ShakkerAi-Labs, can you please fix this?
