Problem using transformers AutoModelForCausalLM and AutoTokenizer with a local model folder

#15
by sleepcat - opened

When called as AutoModelForCausalLM.from_pretrained("downloaded-model-folder-address") and AutoTokenizer.from_pretrained() with the local folder path, it doesn't use the already-downloaded model; it still downloads the model's .safetensors files from the Hub.
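A minimal sketch of the loading pattern in question, assuming the model was already downloaded to a local directory (the folder path below is a placeholder, not the actual path):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

local_path = "./downloaded-model-folder"  # placeholder for the local model directory

# local_files_only=True forces transformers to load from disk and raise an error
# instead of falling back to downloading from the Hub.
tokenizer = AutoTokenizer.from_pretrained(local_path, local_files_only=True)
model = AutoModelForCausalLM.from_pretrained(local_path, local_files_only=True)
```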

Meta Llama org

Can you share the exact command and the logs of what you are running, please?
