Multiple GPU inferencing
#13
by cjj2003 · opened
Is it possible to run this on multiple GPUs? I tried the demo on 4 A6000s and it only seemed to use the first GPU.
Hey @cjj2003, you can run inference with this model on multiple GPUs. Use this:
import torch

num_gpus = torch.cuda.device_count()
device = torch.device("cuda" if num_gpus > 0 else "cpu")
if num_gpus > 1:
    # DataParallel replicates the model on each GPU and splits the batch across them
    model = torch.nn.DataParallel(model, device_ids=list(range(num_gpus)))
model.to(device)
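For reference, here is a minimal self-contained sketch of that pattern. The `torch.nn.Linear` model and random batch below are placeholders standing in for the actual checkpoint and inputs, which this thread does not specify:

```python
import torch

# Placeholder model; load the real checkpoint here in practice
model = torch.nn.Linear(16, 4)

num_gpus = torch.cuda.device_count()
device = torch.device("cuda" if num_gpus > 0 else "cpu")
if num_gpus > 1:
    # DataParallel replicates the module on each GPU and splits
    # the batch dimension of each input across the devices
    model = torch.nn.DataParallel(model, device_ids=list(range(num_gpus)))
model.to(device)

model.eval()
with torch.no_grad():
    batch = torch.randn(8, 16, device=device)
    out = model(batch)  # shape: (8, 4)
```

Note that `DataParallel` only helps when the batch size is greater than 1, since it parallelizes by splitting the batch; for large models that don't fit on one GPU, model parallelism (e.g. `device_map="auto"` in `transformers`/`accelerate`) is needed instead.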
amanrangapur changed discussion status to closed