ValueError: `rope_scaling` must be a dictionary with two fields

#25
by tianke0711 - opened

could you tell how to solve this issue:
import torch
import transformers

# model_id is the Llama model identifier set earlier
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

ValueError: `rope_scaling` must be a dictionary with two fields, `type` and `factor`, got {'factor': 8.0, 'low_freq_factor': 1.0, 'high_freq_factor': 4.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}
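To see why this error fires, here is a minimal sketch (not the actual transformers source) of the old-style validation: earlier releases expected `rope_scaling` to contain exactly the two fields `type` and `factor`, so the newer Llama 3.1 config dict shown in the traceback fails the check. The field values below are copied from the error message above.

```python
# Sketch of the pre-llama3 rope_scaling validation that raises this error.
# Older transformers releases required exactly {"type", "factor"}, so the
# richer Llama 3.1 dict (rope_type, low/high_freq_factor, ...) is rejected.
rope_scaling = {
    "factor": 8.0,
    "low_freq_factor": 1.0,
    "high_freq_factor": 4.0,
    "original_max_position_embeddings": 8192,
    "rope_type": "llama3",
}

def old_style_validate(cfg):
    """Approximation of the old check: a 2-key dict with `type` and `factor`."""
    if not isinstance(cfg, dict) or len(cfg) != 2 or "type" not in cfg or "factor" not in cfg:
        raise ValueError(
            f"`rope_scaling` must be a dictionary with two fields, "
            f"`type` and `factor`, got {cfg}"
        )

try:
    old_style_validate(rope_scaling)
except ValueError as e:
    print(e)  # reproduces the shape of the error in this thread
```

Newer transformers versions recognize the `llama3` rope type, which is why the upgrade below resolves it.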

pip install --upgrade transformers

I am hitting the same issue on a conda env with transformers==4.41.2. What's the version supposed to be installed here?

@qtao913 It should be fixed for versions >=4.43
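A quick way to check whether the installed version is new enough: support for the `llama3` rope type landed in transformers 4.43.0 (my understanding of the release history, not stated in this thread). The helper below is a simple sketch; real code should use `packaging.version` for robust comparisons.

```python
# Naive dotted-version comparison to check whether an installed transformers
# version is recent enough for the `llama3` rope_scaling format.
def supports_llama3_rope(version_str, minimum="4.43.0"):
    """Return True if version_str >= minimum, comparing numeric components."""
    def parts(v):
        return tuple(int(p) for p in v.split("."))
    return parts(version_str) >= parts(minimum)

# 4.41.2 (the conda env mentioned in this thread) is too old;
# 4.45.2 (the Kaggle env below) should be new enough.
print(supports_llama3_rope("4.41.2"))  # False
print(supports_llama3_rope("4.45.2"))  # True
```

In practice, compare `transformers.__version__` against the minimum after upgrading.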

I want to know if anyone got a solution to this, I am facing same issue.

Meta Llama org

@aparikh did the suggestions from above not fix the issue?

Can you do a `pip list` and share your env details please?

@Sanyam
Hi, I am using the Kaggle platform, and my transformers version is:
transformers 4.45.2

I have the same issue and am really waiting for someone to post a solution. I've tried lots of transformers versions; it doesn't help.
