runtime error
tlib-metadata>=6.6.0 in /home/user/.local/lib/python3.9/site-packages (from yapf->mmcv-full==1.5.0) (6.8.0)
Collecting tomli>=2.0.1
  Downloading tomli-2.0.1-py3-none-any.whl (12 kB)
Requirement already satisfied: zipp>=0.5 in /home/user/.local/lib/python3.9/site-packages (from importlib-metadata>=6.6.0->yapf->mmcv-full==1.5.0) (3.17.0)
Installing collected packages: addict, tomli, platformdirs, yapf, mmcv-full
Successfully installed addict-2.4.0 mmcv-full-1.5.0 platformdirs-3.11.0 tomli-2.0.1 yapf-0.40.2

[notice] A new release of pip available: 22.3.1 -> 23.2.1
[notice] To update, run: python -m pip install --upgrade pip

Successfully installed mmcv-full.
load checkpoint from local path: hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth
Traceback (most recent call last):
  File "/home/user/app/app.py", line 25, in <module>
    pose_model = init_pose_model(pose_config, pose_checkpoint, device='cuda')
  File "/home/user/.local/lib/python3.9/site-packages/mmpose/apis/inference.py", line 45, in init_pose_model
    model.to(device)
  File "/home/user/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 907, in to
    return self._apply(convert)
  File "/home/user/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 578, in _apply
    module._apply(fn)
  File "/home/user/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 578, in _apply
    module._apply(fn)
  File "/home/user/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 601, in _apply
    param_applied = fn(param)
  File "/home/user/.local/lib/python3.9/site-packages/torch/nn/modules/module.py", line 905, in convert
    return t.to(device, dtype if t.is_floating_point() or t.is_complex() else None, non_blocking)
  File "/home/user/.local/lib/python3.9/site-packages/torch/cuda/__init__.py", line 216, in _lazy_init
    torch._C._cuda_init()
RuntimeError: Found no NVIDIA driver on your system. Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx
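The traceback shows app.py line 25 hardcoding device='cuda' on a machine with no NVIDIA driver, which is what triggers the RuntimeError when model.to(device) lazily initializes CUDA. A minimal sketch of a CPU fallback, assuming the same init_pose_model call from mmpose (pose_config and pose_checkpoint are the app's own variables, shown here only in a comment):

```python
import torch

# Pick CUDA only when a GPU and working driver are actually visible;
# otherwise fall back to CPU so model.to(device) cannot fail.
device = 'cuda' if torch.cuda.is_available() else 'cpu'
print(device)

# Hypothetical replacement for app.py line 25 (names taken from the traceback):
# pose_model = init_pose_model(pose_config, pose_checkpoint, device=device)
```

On CPU-only hardware (such as a free Spaces container) this selects 'cpu' and the model simply loads more slowly instead of crashing at startup.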