Accuracy Drop
#2 opened by mhyatt000
I tried to reproduce the results reported on this model card, but my numbers do not match the claimed accuracy. I can't figure out where the discrepancy comes from. Can you help me find my mistake?
- Claimed accuracy
  - top 1: 81.8
  - top 5: 95.6
- Received accuracy
  - top 1: 80.1
  - top 5: 93.6
Here are the details of my validation setup:
- I instantiate the pre-trained model with `transformers.pipeline()` and measure top 1 / top 5 accuracy (a rough sketch of my evaluation loop is below).
- Evaluation was performed on CPU.
- The validation dataset was downloaded from image-net.org.
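
For reference, here is roughly what my evaluation loop looks like. The checkpoint id and dataset path are placeholders (in my actual run I load the model from this card), and I assume the validation images are organized into per-synset folders and that the model's label indices follow the usual sorted-WNID convention for ImageNet-1k:

```python
from pathlib import Path

from transformers import pipeline

# Placeholder checkpoint id; in my actual run I load the model from this card.
MODEL_ID = "<model-id-from-this-card>"

# Image-classification pipeline on CPU (device=-1).
classifier = pipeline("image-classification", model=MODEL_ID, device=-1)

# ImageNet validation set from image-net.org, organized as val/<wnid>/<image>.JPEG.
VAL_DIR = Path("imagenet/val")  # placeholder path

# Map each synset directory (WNID) to a class index, assuming the model's label
# indices follow the usual sorted-WNID ordering for ImageNet-1k.
wnids = sorted(d.name for d in VAL_DIR.iterdir() if d.is_dir())
wnid_to_idx = {wnid: idx for idx, wnid in enumerate(wnids)}
label2id = classifier.model.config.label2id

top1 = top5 = total = 0
for class_dir in VAL_DIR.iterdir():
    if not class_dir.is_dir():
        continue
    true_idx = wnid_to_idx[class_dir.name]
    for image_path in sorted(class_dir.glob("*.JPEG")):
        preds = classifier(str(image_path), top_k=5)  # list of {"label", "score"}, best first
        pred_ids = [label2id[p["label"]] for p in preds]
        top1 += int(pred_ids[0] == true_idx)
        top5 += int(true_idx in pred_ids)
        total += 1

print(f"top-1: {100 * top1 / total:.2f}  top-5: {100 * top5 / total:.2f}  (n={total})")
```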