TypeError: 'tokenizers.AddedToken' object is not iterable
#3 · opened by djstrong
I am getting this error when trying to run this model with https://github.com/EleutherAI/lm-evaluation-harness.
Please pull the latest version; the tokenizer fixes should have been pushed to the lm-eval harness.
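One way to pick up the fix is to reinstall the harness from the latest source. A minimal sketch assuming a pip-based setup from a fresh clone (the commands are illustrative, not quoted from this thread):

```shell
# Clone (or update) the harness and install it from source
git clone https://github.com/EleutherAI/lm-evaluation-harness
cd lm-evaluation-harness
git pull origin main   # pick up the latest commits, including tokenizer fixes
pip install -e .       # editable install so future pulls take effect immediately
```

If the harness was installed from PyPI instead, upgrading the installed package (e.g. `pip install -U lm-eval`) may lag behind the repository, so installing from source is the surer way to get a just-pushed fix.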
picocreator changed discussion status to closed