"requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: ..." error when downloading a Hugging Face model

For example, suppose you run the following command,

olive auto-opt --model_name_or_path meta-llama/Llama-3.2-1B-Instruct --trust_remote_code --output_path models/llama --device cpu --provider CPUExecutionProvider --use_ort_genai --precision int4 --log_level 1

and it fails with an error like this:

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:11:24,904] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:11:24,907] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:11:24,910] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:11:24,913] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:11:24,914] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:11:24,914] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:11:24,915] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:11:24,915] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:11:27,564] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:11:27,806] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,815] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,817] [INFO] [engine.py:241:run] Run history for cpu-cpu:
[...23:11:27,818] [INFO] [engine.py:499:dump_run_history] Please install tabulate for better run history output

If you visit the link shown in the message,

https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

it tells you the cause:

Access to model meta-llama/Llama-3.2-1B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct to ask for access.

In other words, when you hit this kind of error, you need to go to the model's page and request access. If you actually visit that page, you will see an "Expand to review and access" button like the one below,

[Screenshot: hf_request_model_access_1.png, the "Expand to review and access" panel on the model page]

(After logging in to Hugging Face) expand that panel, fill in a bit of information, and click the "Submit" button. For this model there is no separate approval step afterward; access is granted immediately.
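
By the way, you can verify from a script whether the requested access has actually been granted, without rerunning the whole olive workflow. Below is a minimal sketch of my own (not from the original workflow): it tries to fetch the same config.json file the traceback failed on, using only huggingface_hub calls that already appear in the traceback.

# Minimal sketch: check whether the gated repo is accessible from this machine.
# huggingface_hub is already installed as a dependency of Olive/transformers.
from huggingface_hub import hf_hub_download
from huggingface_hub.errors import GatedRepoError

repo_id = "meta-llama/Llama-3.2-1B-Instruct"

try:
    # The exact file the Olive conversion pass failed to download.
    path = hf_hub_download(repo_id=repo_id, filename="config.json")
    print("Access OK, config.json cached at:", path)
except GatedRepoError as err:
    # Raised when the repo is gated and you have no access or are not logged in.
    print("Still gated:", err)

If this still prints "Still gated" even after your request was accepted, the remaining cause is usually the missing login described next.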




What if you still get an error like this?

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:24:18,060] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:24:18,063] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:24:18,067] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:24:18,071] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:24:18,071] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:24:18,071] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:24:20,772] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:24:20,989] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:24:20,997] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

That happens because you did not log in to Hugging Face before running the command. ^^ So, run the "huggingface-cli login" command,

(huggingface-build) c:\temp> huggingface-cli login

    _|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
    _|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
    _|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
    _|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
    _|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

    To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Token can be pasted using 'Right-Click'.
Enter your token (input will not be visible):
Add token as git credential? (Y/n) y
Token is valid (permission: read).
The token `test_token` has been saved to c:\temp\python310\cache\huggingface\stored_tokens
Your token has been saved in your configured git credential helpers (manager).
Your token has been saved to c:\temp\python310\cache\huggingface\token
Login successful.
The current active token is: `test_token`

and paste the User access token issued on the Hugging Face site when prompted. Once this is done, the token is stored in the local cache, so you do not need to log in again afterward.
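
If you want to avoid the interactive prompt (for example, on a build server), the same login can be done programmatically. The following is a minimal sketch of my own, not part of the original post: it reads a User access token from the HF_TOKEN environment variable, which recent huggingface_hub versions also read directly, and calls login()/whoami() to cache and verify it.

# Minimal sketch: non-interactive login with a token issued at
# https://huggingface.co/settings/tokens (you set HF_TOKEN beforehand).
import os
from huggingface_hub import login, whoami

token = os.environ.get("HF_TOKEN")  # e.g. set HF_TOKEN=hf_xxx before running
if token:
    login(token=token)  # stores the token in the local HF cache, like huggingface-cli login
    print("Logged in as:", whoami()["name"])
else:
    print("Set the HF_TOKEN environment variable first.")

After either the CLI or the programmatic login, the olive auto-opt command at the top of this post should proceed past the config.json download.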




[I would like to share opinions about this article with you. If you find anything wrong or lacking, or have any questions, please leave a comment anytime.]







[First posted: ]
[Last updated: 6/19/2025]

This work may be used under the Creative Commons Korea Attribution-NonCommercial-NoDerivs 2.0 Korea License.
by SeongTae Jeong, mailto:techsharer at outlook.com
