Unable to test inference on SageMaker
Hi there,
I'm new to HF and tried to deploy and test LLaVA-mistral in Amazon SageMaker. I deployed the model by pasting the code from the model card into a SageMaker notebook, and everything seemed to go well. Then I tried to test model inference from SageMaker Studio following this doc: https://docs.aws.amazon.com/sagemaker/latest/dg/realtime-endpoints-test-endpoints.html
So I set the content type to application/json and used the following body:
{
"image" : "https://www.ikea.com/pl/pl/images/products/silvtjaern-pojemnik__1150132_pe884373_s5.jpg?f=xl",
"question" : "Describe this image"
}
I got the following error:
Received client error (400) from primary with message "{
"code": 400,
"type": "InternalServerException",
"message": "The checkpoint you are trying to load has model type llava_next
but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date."
}
Since I'm new to LLaVA, and since LLaVA is not available among the JumpStart models in SageMaker, I only have the ability to enter the body of the request, not to make the request using the Python SDK.
- So can I actually test the model just by entering the body in the Test Inference option in SageMaker?
- Also, can someone help me with Python SDK code (using the boto3 client) that I can invoke from, say, a Lambda function to test inference? The task would be to submit an image URL and have the model describe the items in the image.
Many thanks
Aleksandar