runtime error
Exit code: 1. Reason:
/usr/local/lib/python3.10/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Traceback (most recent call last):
  File "/home/user/app/app.py", line 7, in <module>
    pipe = pipeline("text-generation", model="Wonder-Griffin/TraXLMistral")
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 895, in pipeline
    framework, model = infer_framework_load_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/base.py", line 286, in infer_framework_load_model
    model = model_class.from_pretrained(model, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/auto_factory.py", line 564, in from_pretrained
    return model_class.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3960, in from_pretrained
    ) = cls._load_pretrained_model(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 4492, in _load_pretrained_model
    raise RuntimeError(f"Error(s) in loading state_dict for {model.__class__.__name__}:\n\t{error_msg}")
RuntimeError: Error(s) in loading state_dict for GPT2LMHeadModel:
	size mismatch for lm_head.weight: copying a param with shape torch.Size([50257, 128]) from checkpoint, the shape in current model is torch.Size([50257, 768]).
	You may consider adding `ignore_mismatched_sizes=True` in the model `from_pretrained` method.
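What the final error means: the checkpoint on the Hub stores `lm_head.weight` with an embedding width of 128, but the `GPT2LMHeadModel` built from the repo's config expects a width of 768, so transformers refuses to copy the tensor. A minimal stand-in sketch of that shape check, using plain dictionaries instead of real tensors (the shapes are taken from the traceback; everything else is illustrative):

```python
# Stand-in for the checkpoint's stored parameter shapes (from the traceback).
checkpoint_shapes = {"lm_head.weight": (50257, 128)}
# Stand-in for the shapes the freshly instantiated model expects.
model_shapes = {"lm_head.weight": (50257, 768)}

# Collect every parameter whose saved shape disagrees with the model's shape,
# which is the condition that makes transformers raise the RuntimeError above.
mismatches = [
    (name, checkpoint_shapes[name], model_shapes[name])
    for name in checkpoint_shapes
    if checkpoint_shapes[name] != model_shapes[name]
]
for name, ckpt_shape, model_shape in mismatches:
    print(f"size mismatch for {name}: checkpoint {ckpt_shape} vs model {model_shape}")
```

Passing `ignore_mismatched_sizes=True` (as the error suggests) would skip such parameters and leave them randomly initialized, so it only masks the underlying config/checkpoint disagreement rather than fixing it.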