Hugging Face model download error: "requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: ..."

For example, what if you run the following command,

olive auto-opt --model_name_or_path meta-llama/Llama-3.2-1B-Instruct --trust_remote_code --output_path models/llama --device cpu --provider CPUExecutionProvider --use_ort_genai --precision int4 --log_level 1

and it fails with an error like this?

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:11:24,904] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:11:24,907] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:11:24,910] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:11:24,913] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:11:24,914] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:11:24,914] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:11:24,915] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:11:24,915] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:11:27,564] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:11:27,806] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,815] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,817] [INFO] [engine.py:241:run] Run history for cpu-cpu:
[...23:11:27,818] [INFO] [engine.py:499:dump_run_history] Please install tabulate for better run history output

If you visit the link given in the message,

https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

it explains the cause as follows:

Access to model meta-llama/Llama-3.2-1B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct to ask for access.
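
Incidentally, the same 401 response can be reproduced outside of Olive. Below is a minimal sketch using the requests package (this check is only for illustration and is not part of the original workflow):

import requests

# Request the gated file without any credentials; the Hub rejects it with 401.
url = "https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json"
resp = requests.get(url)
print(resp.status_code)   # 401 for a gated repo when no token is sent
print(resp.text[:200])    # the response body explains why access was denied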

In other words, when this kind of error occurs, you have to go to the model's page and request access. If you actually visit the page at that link, an "Expand to review and access" button appears as shown below,

(screenshot: hf_request_model_access_1.png — the "Expand to review and access" panel on the model page)

(After logging in to Hugging Face) you need to expand it, fill in a little information, and click the "Submit" button. There is no separate approval step afterwards; access is granted right away.
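
Once the form has been submitted, you can quickly confirm that access was really granted by downloading the same config.json directly. The following is a minimal sketch assuming the huggingface_hub package is installed and you already have a User Access Token (the login step itself is covered next; the token string is a placeholder):

from huggingface_hub import hf_hub_download

# Downloads config.json from the gated repo into the local cache; raises a
# GatedRepoError/401 if access has not been granted or the token is invalid.
path = hf_hub_download(
    repo_id="meta-llama/Llama-3.2-1B-Instruct",
    filename="config.json",
    token="hf_xxxxxxxxxxxxxxxx",  # placeholder - use your own token
)
print(path)  # local cache path of config.json when access and the token are valid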




What if the same error still occurs after that?

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:24:18,060] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:24:18,063] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:24:18,067] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:24:18,071] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:24:18,071] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:24:18,071] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:24:20,772] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:24:20,989] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:24:20,997] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

It is because you did not log in to Hugging Face before running the command. ^^ So run the "huggingface-cli login" command,

(huggingface-build) c:\temp> huggingface-cli login

    _|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
    _|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
    _|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
    _|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
    _|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

    To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Token can be pasted using 'Right-Click'.
Enter your token (input will not be visible):
Add token as git credential? (Y/n) y
Token is valid (permission: read).
The token `test_token` has been saved to c:\temp\python310\cache\huggingface\stored_tokens
Your token has been saved in your configured git credential helpers (manager).
Your token has been saved to c:\temp\python310\cache\huggingface\token
Login successful.
The current active token is: `test_token`

and enter the User Access Token issued on the Hugging Face site. Once done as above, the token is stored in the cache, so you do not need to log in again.
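
If the interactive prompt is inconvenient (for example in a CI environment), the token can also be supplied non-interactively. Here is a minimal sketch of two alternatives supported by huggingface_hub (the token strings are placeholders):

import os
from huggingface_hub import login, whoami

# 1) Call login() directly with a User Access Token issued at
#    https://huggingface.co/settings/tokens
login(token="hf_xxxxxxxxxxxxxxxx")  # placeholder token

# 2) Or set the HF_TOKEN environment variable before running olive;
#    huggingface_hub picks it up automatically.
# os.environ["HF_TOKEN"] = "hf_xxxxxxxxxxxxxxxx"

print(whoami()["name"])  # prints the account the stored/passed token belongs to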




[I would like to share opinions on this article with you. If anything is wrong or lacking, or if you have questions, please feel free to leave a comment.]







