question_classif_distil_bert / special_tokens_map.json

Commit History

add tokenizer
9d4f9c8 · verified
Veekah committed on
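
The committed file, special_tokens_map.json, records which tokens the tokenizer treats as special (e.g. [CLS], [SEP], [PAD], [MASK] for a DistilBERT-style tokenizer). A minimal sketch of inspecting it with the transformers library, assuming the repo id is Veekah/question_classif_distil_bert (inferred from the path and author shown on this page, not confirmed here):

```python
from transformers import AutoTokenizer

# Download the tokenizer files from the Hub repo, including
# special_tokens_map.json added in this commit.
# The repo id below is an assumption based on the page path and author.
tokenizer = AutoTokenizer.from_pretrained("Veekah/question_classif_distil_bert")

# special_tokens_map exposes the contents of special_tokens_map.json,
# mapping roles (cls_token, sep_token, pad_token, ...) to token strings.
print(tokenizer.special_tokens_map)
```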