zeroGPU takes more time for inference! #127
opened by alibabasglab
I found that inference runs noticeably slower on zeroGPU than on a local GPU (V100, 16 GB). What could be causing the overhead? Is it possible to speed it up?
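One likely contributor on zeroGPU is per-call overhead (device attach and warm-up) paid before the model actually runs, which a local GPU does not incur. A framework-agnostic way to check is to time the first call separately from steady-state calls; if the first call dominates, the slowdown is start-up overhead rather than raw compute. A minimal sketch (the `time_calls` helper and the dummy `infer` function below are hypothetical, for illustration only):

```python
import time

def time_calls(fn, n_steady=5):
    """Time the first (cold) call separately from the average of later calls."""
    t0 = time.perf_counter()
    fn()  # cold call: on zeroGPU this may include device attach / warm-up
    cold = time.perf_counter() - t0

    steady = []
    for _ in range(n_steady):
        t0 = time.perf_counter()
        fn()  # warm calls: closer to the true per-inference cost
        steady.append(time.perf_counter() - t0)
    return cold, sum(steady) / len(steady)

def infer():
    # Stand-in for a real model forward pass (hypothetical).
    time.sleep(0.01)

cold, warm_avg = time_calls(infer)
print(f"cold: {cold:.3f}s, warm avg: {warm_avg:.3f}s")
```

If the warm average on zeroGPU is close to the local V100 number while the cold call is much larger, batching requests inside a single decorated call (so the attach cost is paid once) is one possible mitigation.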
alibabasglab changed discussion title from "much slower inference on zeroGPU!" to "zeroGPU takes more time for inference!"