Make the tokenizer config match that of the large and base versions
#11
by bryant1410 · opened
Hey, I just realized the tokenizer config is different from the one in the large and base versions. I'm guessing it's an error and the one I'm submitting here is the correct one, but I'm not sure.
To add more useful info: the `model_max_length` doesn't match the `max_position_embeddings` field (512) from the `config.json` file.
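For reference, a quick way to see the mismatch is to load the tokenizer and the model config side by side and compare the two fields. This is just a minimal sketch; the repo id below is a placeholder, not the actual checkpoint being discussed here.

```python
from transformers import AutoConfig, AutoTokenizer

repo_id = "your-org/your-model"  # placeholder: substitute the small/base/large checkpoint to compare

tokenizer = AutoTokenizer.from_pretrained(repo_id)
config = AutoConfig.from_pretrained(repo_id)

# tokenizer_config.json -> model_max_length; config.json -> max_position_embeddings
print("model_max_length:        ", tokenizer.model_max_length)
print("max_position_embeddings: ", config.max_position_embeddings)
```

If the two numbers differ, inputs can be truncated (or not truncated) at a length that doesn't match what the model's position embeddings actually support.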
intfloat changed pull request status to merged