singletongue: Updates incorrect tokenizer configuration file (#2), commit 7650a07
{
  "do_lower_case": false,
  "word_tokenizer_type": "mecab",
  "subword_tokenizer_type": "character",
  "model_max_length": 512
}
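For context, these fields correspond to constructor arguments of `BertJapaneseTokenizer` in the Transformers library: MeCab segments Japanese text into words, and each word is then split into individual characters as subword units. A minimal stdlib-only sketch of reading the configuration (the variable names are illustrative, not part of any API):

```python
import json

# Tokenizer configuration as shipped in the file above (verbatim).
CONFIG_JSON = (
    '{"do_lower_case": false, "word_tokenizer_type": "mecab", '
    '"subword_tokenizer_type": "character", "model_max_length": 512}'
)

config = json.loads(CONFIG_JSON)

# MeCab performs word-level segmentation of Japanese text; each word
# is then broken down into single characters as subword units.
word_tok = config["word_tokenizer_type"]        # "mecab"
subword_tok = config["subword_tokenizer_type"]  # "character"

# Casing is preserved (do_lower_case is false), and inputs are capped
# at 512 tokens, matching the model's maximum sequence length.
lower = config["do_lower_case"]                 # False
max_len = config["model_max_length"]            # 512
```

Keeping `do_lower_case` false matters here: lowercasing is a no-op for Japanese script but would corrupt case-sensitive Latin-script tokens mixed into the text.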