gtvracer 
posted an update Oct 13
Hello Everyone,
I signed up for Pro and started a ZeroGPU Space with the default Gradio chatbot project. When building the Space, it won't even start the sample Gradio app. Pretty disappointing when it fails right out of the box...

Has anyone encountered this yet?
Thanks...

This is the output. It's odd, since the message is only a warning, so why wouldn't the app start?

/usr/local/lib/python3.10/site-packages/gradio/components/chatbot.py:228: UserWarning: The 'tuples' format for chatbot messages is deprecated and will be removed in a future version of Gradio. Please set type='messages' instead, which uses openai-style 'role' and 'content' keys.
warnings.warn(
* Running on local URL: http://0.0.0.0:7860, with SSR ⚡

To create a public link, set share=True in launch().

Stopping Node.js server...

There's a little trick to programming a ZeroGPU Space. This example is for Python, but the approach in JavaScript would be no different.
But still, why does it only emit a warning when it would normally crash with an error?
https://discuss.huggingface.co/t/issues-with-sadtalker-zerogpu-spaces-inquiry-about-community-grant/110625/11


Thanks John6666! Adding the import and the spaces.GPU decorator fixed the issue, and the base ZeroGPU Gradio app runs!

I hope Hugging Face fixes their template so others don't face this as their first experience with the platform. It really doesn't look good when the basic project supplied by HF won't run.