runtime error
The cache for model files in Transformers v4.22.0 has been updated. Migrating your old cache. This is a one-time only operation. You can interrupt this and resume the migration later on by calling `transformers.utils.move_cache()`.
0it [00:00, ?it/s]
/usr/local/lib/python3.10/site-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 36, in <module>
    pipe = OVStableDiffusionPipeline.from_pretrained(model_id, compile = False, ov_config = {"CACHE_DIR":""})
  File "/usr/local/lib/python3.10/site-packages/optimum/modeling_base.py", line 372, in from_pretrained
    return from_pretrained_method(
  File "/usr/local/lib/python3.10/site-packages/optimum/intel/openvino/modeling_diffusion.py", line 257, in _from_pretrained
    unet = cls.load_model(
  File "/usr/local/lib/python3.10/site-packages/optimum/intel/openvino/modeling_base.py", line 126, in load_model
    model = core.read_model(file_name) if not file_name.suffix == ".onnx" else convert_model(file_name)
  File "/usr/local/lib/python3.10/site-packages/openvino/runtime/ie_api.py", line 507, in read_model
    return Model(super().read_model(model))
RuntimeError: Exception from src/inference/src/core.cpp:99:
Exception from src/inference/src/model_reader.cpp:137:
Unable to read the model: /home/user/.cache/huggingface/hub/models--rubbrband--realDream_11/snapshots/77890b2fcb89ed358406e6247298de78a32b5adb/unet/openvino_model.xml
Please check that model format: xml is supported and the model is correct.
Available frontends: ir onnx paddle pytorch tf tflite
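For context, the failure comes from the `OVStableDiffusionPipeline.from_pretrained` call at app.py line 36: OpenVINO cannot read `unet/openvino_model.xml` from the cached snapshot of `rubbrband/realDream_11` (the repo name is taken from the cache path in the traceback). One possible cause is that the repository does not ship valid pre-converted OpenVINO IR files, in which case the weights have to be exported to IR at load time. Below is a minimal sketch, assuming that is the case; `export=True` is optimum-intel's on-the-fly conversion flag, and the other arguments mirror the original call. Whether this resolves the error for this particular repo is an assumption, not confirmed by the log.

```python
# Minimal sketch, not the original app.py: reload the pipeline and ask
# optimum-intel to export the diffusers weights to OpenVINO IR instead of
# expecting ready-made openvino_model.xml files in the snapshot.
from optimum.intel import OVStableDiffusionPipeline

model_id = "rubbrband/realDream_11"  # inferred from the cache path in the traceback

pipe = OVStableDiffusionPipeline.from_pretrained(
    model_id,
    export=True,               # convert to OpenVINO IR at load time (assumption: repo has no IR files)
    compile=False,             # same as the failing call: defer compilation
    ov_config={"CACHE_DIR": ""},
)
pipe.compile()                 # compile explicitly once the device/config is settled
```

If the repo does contain IR files and they are simply truncated or corrupted in the local cache, deleting the cached snapshot under `~/.cache/huggingface/hub/models--rubbrband--realDream_11` and re-downloading is another thing worth trying.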