gpt2_1_2M_100eps / tokenizer_config.json
{"model_max_length": 1024}