decapoda-research-llama-7B-hf / generation_config.json
Commit 5f98eef: update generation config
{"_from_model_config": true, "bos_token_id": 0, "eos_token_id": 1, "pad_token_id": 0, "transformers_version": "4.27.0.dev0"}