"Model type not found" error
#18 · opened by tript
@wanghaofan Inference is not working; I'm receiving the error:
"Model type not found"
I also get this error when deploying the model to Spaces, once the container starts to run.
There's an error in the input stream and the logs cannot be accessed.
For the past few hours, almost all Spaces on HF have been building successfully but failing to run, so I don't know whether this problem is specific to this model.
https://status.huggingface.co/
https://discuss.huggingface.co/t/504-gateway-time-out/107971/
I looked into it. There are several problems with the setup of this repo.
- The pipeline tag is misconfigured. In the README metadata (https://huggingface.co/Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro/edit/main/README.md) it is currently `pipeline_tag: text-to-image`, but it should probably be `pipeline_tag: image-to-image` (https://huggingface.co/tasks/image-to-image). You can check which tag the Hub is actually serving; see the sketch after this list.
- Serverless Inference for a ControlNet does not work in many cases, so it may be better to turn off Inference for the repo entirely.
- Other minor issues.
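For reference, here is a minimal sketch of checking the card metadata programmatically with huggingface_hub (assuming a recent version; `ModelCard.load` reads the README front matter from the Hub):

```python
# Requires: pip install huggingface_hub
from huggingface_hub import ModelCard

# Load the model card (the README.md front matter) straight from the Hub.
card = ModelCard.load("Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro")
print(card.data.pipeline_tag)  # currently "text-to-image"; the fix would make this "image-to-image"
```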
So, is there any workaround to run the inference?
Unless the repo author or HF fixes it, the only hosted option is a paid Inference Endpoint.
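If running the model yourself is an option, the usual diffusers route should still work regardless of the broken widget. A minimal sketch, assuming a recent diffusers release that ships `FluxControlNetPipeline`, a GPU with enough VRAM for FLUX.1-dev, and illustrative values for the control mode and conditioning scale:

```python
import torch
from diffusers import FluxControlNetModel, FluxControlNetPipeline
from diffusers.utils import load_image

# Load the ControlNet weights and attach them to the FLUX.1-dev base model.
controlnet = FluxControlNetModel.from_pretrained(
    "Shakker-Labs/FLUX.1-dev-ControlNet-Union-Pro", torch_dtype=torch.bfloat16
)
pipe = FluxControlNetPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev", controlnet=controlnet, torch_dtype=torch.bfloat16
)
pipe.to("cuda")

# Any preprocessed conditioning image (e.g. a canny edge map) works here.
control_image = load_image("./conditioning.png")  # hypothetical local file
width, height = control_image.size

image = pipe(
    prompt="a photo of a cat",
    control_image=control_image,
    control_mode=0,  # union model: branch index; 0 is canny per the model card (assumption)
    controlnet_conditioning_scale=0.7,
    width=width,
    height=height,
    num_inference_steps=24,
    guidance_scale=3.5,
).images[0]
image.save("output.png")
```

`control_mode` selects the conditioning branch of the union model; the index mapping comes from the model card, so double-check it there before relying on these values.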
@wanghaofan @ShakkerAi-Labs, could you please fix this?