roberta-base-squad2 / tokenizer_config.json
Branden Chan · Update to v2 · deedc3e
{"do_lower_case": false, "model_max_length": 512, "full_tokenizer_file": null}
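A minimal sketch of what this config encodes, using only the Python standard library to parse the JSON shown above: RoBERTa's BPE tokenizer is case-sensitive, the model accepts at most 512 tokens per input, and no separate serialized tokenizer file is referenced.

```python
import json

# The tokenizer_config.json payload shown above, verbatim.
raw = '{"do_lower_case": false, "model_max_length": 512, "full_tokenizer_file": null}'

config = json.loads(raw)

# RoBERTa is case-sensitive, so lowercasing is disabled.
assert config["do_lower_case"] is False

# Inputs longer than 512 tokens must be truncated or windowed
# (e.g. with a sliding window over long SQuAD contexts).
assert config["model_max_length"] == 512

# JSON null maps to Python None: no standalone tokenizer file is bundled.
assert config["full_tokenizer_file"] is None

print("config OK:", config)
```

In practice this file is consumed automatically when the tokenizer is loaded (for example via `AutoTokenizer.from_pretrained("deepset/roberta-base-squad2")` in the `transformers` library); the snippet above only illustrates the fields themselves.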