Trouble converting to ONNX

#21
by shaunak404 - opened

I am trying to convert the model to ONNX with optimum-cli using:
optimum-cli export onnx --model meta-llama/Llama-3-7B /Llama-3-7B
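For reference, here is a minimal Python sketch of what I understand the CLI command does under the hood, via optimum's main_export API (the task string and output directory are my guesses, not something I have verified against the docs):

```python
# Minimal sketch of the CLI export through optimum's Python API.
# The task string and output directory are illustrative guesses.
from optimum.exporters.onnx import main_export

main_export(
    model_name_or_path="meta-llama/Llama-3-7B",
    output="Llama-3-7B-onnx",
    task="text-generation-with-past",
)
```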

but get the following error:

ValueError: rope_scaling must be a dictionary with two fields, type and factor, got {'factor': 32.0, 'high_freq_factor': 4.0, 'low_freq_factor': 1.0, 'original_max_position_embeddings': 8192, 'rope_type': 'llama3'}

I looked at a previous thread that suggested upgrading transformers to a certain version to get rid of the dependency conflicts. However, after trying everything on that thread, nothing worked and I still get the above error.
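In case it helps, this is how I am sanity-checking the installed versions. My assumption (not confirmed) is that optimum pins an older transformers release that predates the new rope_scaling format with rope_type "llama3", which would explain the ValueError:

```python
# Print installed versions of the packages involved in the export.
# Assumption: the rope_scaling dict with rope_type="llama3" is only
# understood by newer transformers releases, so an older pinned
# version would raise the ValueError above.
from importlib.metadata import version

for pkg in ("transformers", "optimum", "onnx"):
    print(pkg, version(pkg))
```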

I tried the same command with llama2-7b and it works fine without any package upgrades of any sort.
