runtime error
Traceback (most recent call last):
  File "/home/user/app/app.py", line 6, in <module>
    config = AutoConfig.from_pretrained("meta-llama/Meta-Llama-3.1-8B")
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 989, in from_pretrained
    return config_class.from_dict(config_dict, **unused_kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 772, in from_dict
    config = cls(**config_dict)
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 161, in __init__
    self._rope_scaling_validation()
  File "/usr/local/lib/python3.10/site-packages/transformers/models/llama/configuration_llama.py", line 182, in _rope_scaling_validation
    raise ValueError(
ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
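This ValueError is the signature of loading a Llama 3.1 checkpoint with a transformers release that predates Llama 3.1 support: the model's config.json uses the newer `rope_scaling` schema (`rope_type: "llama3"` plus frequency factors), while older validation code only accepts `{'type': ..., 'factor': ...}`. The usual fix is upgrading transformers; support for this schema landed around v4.43.0 (an assumption worth verifying against the release notes). A minimal sketch of the version check, with the threshold as a stated assumption:

```python
# Assumption: transformers >= 4.43.0 understands the 'llama3' rope_scaling
# schema; older releases raise the ValueError shown in the traceback above.
MIN_SUPPORTED = (4, 43, 0)

def parse_version(v: str) -> tuple:
    """Parse the leading 'major.minor.patch' of a version string into ints."""
    return tuple(int(part) for part in v.split(".")[:3])

def supports_llama3_rope(transformers_version: str) -> bool:
    """True if this transformers version should parse Llama 3.1 configs."""
    return parse_version(transformers_version) >= MIN_SUPPORTED

# In the failing container, something like this would report False:
print(supports_llama3_rope("4.42.4"))  # False -> upgrade needed
print(supports_llama3_rope("4.44.0"))  # True
```

In practice the upgrade is just `pip install -U "transformers>=4.43.0"` (or pinning that version in the Space's requirements.txt) followed by a rebuild.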