Hugging Face model download error: "requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: ..."

For example, suppose you run the following command,

olive auto-opt --model_name_or_path meta-llama/Llama-3.2-1B-Instruct --trust_remote_code --output_path models/llama --device cpu --provider CPUExecutionProvider --use_ort_genai --precision int4 --log_level 1

and it fails with an error like this:

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:11:24,904] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:11:24,907] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:11:24,910] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:11:24,913] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:11:24,914] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:11:24,914] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:11:24,915] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:11:24,915] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:11:27,564] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:11:27,806] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,815] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,817] [INFO] [engine.py:241:run] Run history for cpu-cpu:
[...23:11:27,818] [INFO] [engine.py:499:dump_run_history] Please install tabulate for better run history output

If you visit the link shown in the message,

https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

it tells you the cause:

Access to model meta-llama/Llama-3.2-1B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct to ask for access.

In other words, when this kind of error occurs, you need to go to the model's page and request access. If you actually visit the page at that link, you will see an "Expand to review and access" button like this,

hf_request_model_access_1.png

(After logging in to Hugging Face) expand that section, fill in the requested information, and click the "Submit" button. There is no separate approval step afterward; access is granted immediately.
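
By the way, once access has been granted, you can verify it from Python before re-running the olive command. Below is a minimal sketch, assuming huggingface_hub is installed (it already appears in the tracebacks above); it simply tries to fetch the same config.json that the failing request pointed at:

# Minimal sketch: check whether the gated repo is now accessible to your account.
from huggingface_hub import hf_hub_download
from huggingface_hub.errors import GatedRepoError

try:
    path = hf_hub_download("meta-llama/Llama-3.2-1B-Instruct", "config.json")
    print("Access OK:", path)
except GatedRepoError as err:
    # Still 401: access has not been granted yet, or you are not logged in.
    print("Still gated:", err)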




What if the same error still occurs even after that?

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:24:18,060] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:24:18,063] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:24:18,067] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:24:18,071] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:24:18,071] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:24:18,071] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:24:20,772] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:24:20,989] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:24:20,997] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

That is because you did not log in to Hugging Face before running the command. ^^ So run "huggingface-cli login",

(huggingface-build) c:\temp> huggingface-cli login

    _|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
    _|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
    _|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
    _|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
    _|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

    To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Token can be pasted using 'Right-Click'.
Enter your token (input will not be visible):
Add token as git credential? (Y/n) y
Token is valid (permission: read).
The token `test_token` has been saved to c:\temp\python310\cache\huggingface\stored_tokens
Your token has been saved in your configured git credential helpers (manager).
Your token has been saved to c:\temp\python310\cache\huggingface\token
Login successful.
The current active token is: `test_token`

and enter the User access token issued from the Hugging Face site. Once logged in like this, the token is cached, so you do not need to log in again afterward.
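
If you prefer not to use the interactive CLI (for example, on a build server), the same login can be done programmatically. Here is a minimal sketch, assuming you export the token yourself in an HF_TOKEN environment variable; login() and whoami() are standard huggingface_hub functions:

import os
from huggingface_hub import login, whoami

# Minimal sketch: non-interactive login using a token supplied via an
# environment variable (set HF_TOKEN yourself before running this).
login(token=os.environ["HF_TOKEN"])   # stores the token in the local HF cache
print(whoami()["name"])               # sanity check: prints your account name

Since huggingface_hub also reads the HF_TOKEN environment variable directly, exporting it before running "olive auto-opt" may be sufficient on its own.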




