How to use this onnx model
#84 by mkj69 - opened
How do I use this ONNX model, and in particular, how do I configure the tokenizer based on these JSON files?
I have not used ONNX for this model before, but I have a function that works with other models.
The tokenizer is set up outside the function via `tokenizer = AutoTokenizer.from_pretrained(model_name)`.
You might need to adjust the function below based on the model and your needs.
```python
# Function to perform inference with ONNX.
# Assumes `tokenizer` and `ort_session` (an onnxruntime.InferenceSession)
# are already created outside this function.
def onnx_inference(question, answer):
    # Tokenize the (question, answer) pair into PyTorch tensors
    inputs = tokenizer(question, answer, return_tensors="pt")
    # Feed only the inputs the ONNX graph actually declares
    input_names = ort_session.get_inputs()
    inputs_onnx = {
        input_name.name: inputs[input_name.name].numpy() for input_name in input_names
    }
    outputs = ort_session.run(None, inputs_onnx)
    return outputs[0][0]
```
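The key step in that function is matching the session's declared input names against the tokenizer's outputs, so that extra tokenizer fields the graph does not accept (e.g. `token_type_ids`) are dropped. A minimal sketch of just that matching logic, using plain NumPy arrays and a mocked session input list (the names and values here are illustrative assumptions, not from any particular model):

```python
import numpy as np

# Mock tokenizer output (normally a BatchEncoding of torch tensors)
tokenized = {
    "input_ids": np.array([[101, 2054, 2003, 102]]),
    "attention_mask": np.array([[1, 1, 1, 1]]),
    "token_type_ids": np.array([[0, 0, 0, 0]]),  # not accepted by this mock model
}

# Mock of ort_session.get_inputs(): this hypothetical graph declares two inputs
class MockInput:
    def __init__(self, name):
        self.name = name

session_inputs = [MockInput("input_ids"), MockInput("attention_mask")]

# Same dict comprehension as in onnx_inference: keep only declared inputs
inputs_onnx = {i.name: tokenized[i.name] for i in session_inputs}

print(sorted(inputs_onnx))  # ['attention_mask', 'input_ids']
```

If your model rejects an input with a "Required inputs are missing" or "Invalid input name" error, printing `[i.name for i in ort_session.get_inputs()]` shows exactly what the graph expects.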