runtime error

Exit code: 1. Reason:

WARNING 10-10 08:10:05 _custom_ops.py:18] Failed to import from vllm._C with ImportError('libcuda.so.1: cannot open shared object file: No such file or directory')
Token is valid (permission: write).
Your token has been saved in your configured git credential helpers (store).
Your token has been saved to /home/user/.cache/huggingface/token
Login successful
Traceback (most recent call last):
  File "/home/user/app/app.py", line 20, in <module>
    "meta-llama": LLM(model="meta-llama/Meta-Llama-3.1-8B"),
  File "/usr/local/lib/python3.10/site-packages/vllm/entrypoints/llm.py", line 214, in __init__
    self.llm_engine = LLMEngine.from_engine_args(
  File "/usr/local/lib/python3.10/site-packages/vllm/engine/llm_engine.py", line 561, in from_engine_args
    engine_config = engine_args.create_engine_config()
  File "/usr/local/lib/python3.10/site-packages/vllm/engine/arg_utils.py", line 873, in create_engine_config
    device_config = DeviceConfig(device=self.device)
  File "/usr/local/lib/python3.10/site-packages/vllm/config.py", line 1081, in __init__
    raise RuntimeError("Failed to infer device type")
RuntimeError: Failed to infer device type
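The warning at the top of the log is the root cause: vLLM's default GPU build tries to load the NVIDIA driver library libcuda.so.1, and when no GPU driver is present (for example, on CPU-only hardware) it later cannot infer a device type. As a minimal sketch, assuming only the Python standard library, you could check whether the driver library is even locatable before constructing the LLM; cuda_driver_present is a hypothetical helper, not part of vLLM:

    # Hypothetical pre-flight check (not from the original app.py).
    # vLLM's default build dlopens libcuda.so.1, the NVIDIA driver
    # library; if the linker cannot find it, device inference fails.
    import ctypes.util


    def cuda_driver_present() -> bool:
        """Return True if the NVIDIA driver library can be located."""
        return ctypes.util.find_library("cuda") is not None


    if __name__ == "__main__":
        if cuda_driver_present():
            print("libcuda found: the CUDA backend should be usable")
        else:
            print("libcuda missing: vLLM cannot infer a CUDA device")

On a CPU-only container this prints the "libcuda missing" branch, matching the ImportError in the log; the fix is to run on GPU hardware or install a CPU build of vLLM.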
