I get the following error when trying to load meta-llama/Llama-3.2-11B-Vision:
HTTPError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_errors.py in hf_raise_for_status(response, endpoint_name)
303 try:
--> 304 response.raise_for_status()
305 except HTTPError as e:
... 18 frames hidden ...
HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/meta-llama/Llama-3.2-11B-Vision/resolve/main/config.json
The above exception was the direct cause of the following exception:
HfHubHTTPError Traceback (most recent call last)
HfHubHTTPError: (Request ID: Root=1-66f9889d-263aa8652ab496095c72336f;af631b09-05ed-4ad3-852d-05a3ae253a17)
403 Forbidden: Please enable access to public gated repositories in your fine-grained token settings to view this repository.
Cannot access content at: https://huggingface.co/meta-llama/Llama-3.2-11B-Vision/resolve/main/config.json.
If you are trying to create or update content, make sure you have a token with the write role.
The above exception was the direct cause of the following exception:
LocalEntryNotFoundError Traceback (most recent call last)
LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
The above exception was the direct cause of the following exception:
OSError Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/transformers/utils/hub.py in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
444 ):
445 return resolved_file
--> 446 raise EnvironmentError(
447 f"We couldn't connect to '{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this file, couldn't find it in the"
448 f" cached files and it looks like {path_or_repo_id} is not the path to a directory containing a file named"
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like meta-llama/Llama-3.2-11B-Vision is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.
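For reference, the failure can be reproduced in isolation by fetching the gated config.json directly; a minimal sketch, assuming huggingface_hub is installed and using a placeholder token value:

from huggingface_hub import hf_hub_download, whoami

token = "hf_..."  # placeholder: your actual HF token
print(whoami(token=token))  # confirms which account the token belongs to

# A 403 here means the token cannot read the gated repo: either the
# access request was never approved, or a fine-grained token is missing
# the "read access to public gated repos" permission.
path = hf_hub_download(
    repo_id="meta-llama/Llama-3.2-11B-Vision",
    filename="config.json",
    token=token,
)
print(path)  # local cache path of config.json once access works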
Can you make sure you have access to the model (request access from the model page) and then authenticate with a fresh HF token, please?
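For example, you can re-authenticate interactively so the new token is stored for subsequent calls; a minimal sketch:

from huggingface_hub import login
login()  # prompts for the token and caches it for later downloads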
I have the same question.
I have a valid access token. Is there any solution to this problem?
Hello,
I have obtained an access token for meta-llama/Llama-3.2-1B, which should apply to the whole Llama 3.2 model family, right?
Well, it only works for the vision-less models, not for meta-llama/Llama-3.2-11B-Vision-Instruct. A request form opens in that case, but I cannot complete it.
Can you double-check if there is any bug in the web form?
This solved the issue:
- request access to llama repo from hugging face
- generate access token and toggle "Read access to contents of all public gated repos you can access" in the access token permission
- login using access token in the python script
from huggingface_hub import login
login(token="<your_access_token>")  # replace with your HF token
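After logging in, a quick way to confirm the gated repo is reachable is to load just the config; a minimal sketch, assuming transformers is installed and your access request has been approved:

from transformers import AutoConfig

config = AutoConfig.from_pretrained("meta-llama/Llama-3.2-11B-Vision")
print(config.model_type)  # fetches config.json without a 403 once access is granted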
It works well. Thank you for your help.