"requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: ..." error when downloading a Hugging Face model

For example, suppose you run the following command,

olive auto-opt --model_name_or_path meta-llama/Llama-3.2-1B-Instruct --trust_remote_code --output_path models/llama --device cpu --provider CPUExecutionProvider --use_ort_genai --precision int4 --log_level 1

and it fails with an error like this:

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:11:24,904] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:11:24,907] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:11:24,910] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:11:24,913] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:11:24,914] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:11:24,914] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:11:24,915] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:11:24,915] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:11:27,564] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:11:27,806] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,815] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-6850260f-2937b0d24ae1a494677199c0;72657359-ba7a-455c-b388-dccc23a6d25c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:11:27,817] [INFO] [engine.py:241:run] Run history for cpu-cpu:
[...23:11:27,818] [INFO] [engine.py:499:dump_run_history] Please install tabulate for better run history output

If you visit the link shown in the message,

https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

it tells you the cause, as shown below.

Access to model meta-llama/Llama-3.2-1B-Instruct is restricted and you are not in the authorized list. Visit https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct to ask for access.

In other words, when this kind of error occurs, you need to go to the model's description page and request access. If you actually visit that page, an "Expand to review and access" button appears, as shown below,

hf_request_model_access_1.png

(After logging in to Hugging Face,) you expand it, fill in a bit of information, and click the "Submit" button. There is no separate approval process afterwards; access is granted immediately.
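
Once access has been granted, you can confirm it outside of Olive with a quick check. The following is just a minimal sketch, assuming only that the huggingface_hub package is installed and that you are logged in; it tries to fetch the same config.json file the failing request above was asking for.

from huggingface_hub import hf_hub_download
from huggingface_hub.errors import GatedRepoError

# Try to fetch the same config.json that Olive failed to download.
# If access has been granted and a valid token is available, this succeeds;
# otherwise it raises GatedRepoError, just like the traceback above.
try:
    path = hf_hub_download(
        repo_id="meta-llama/Llama-3.2-1B-Instruct",
        filename="config.json",
    )
    print("Access OK:", path)
except GatedRepoError as err:
    print("Still gated:", err)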




What if the same error still occurs even after that?

Loading HuggingFace model from meta-llama/Llama-3.2-1B-Instruct
[...23:24:18,060] [INFO] [run.py:142:run_engine] Running workflow default_workflow
[...23:24:18,063] [INFO] [cache.py:138:__init__] Using cache directory: C:\foundry_cache\.olive-cache\default_workflow
[...23:24:18,067] [INFO] [accelerator_creator.py:217:create_accelerators] Running workflow on accelerator specs: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:223:run] Running Olive on accelerator: cpu-cpu
[...23:24:18,070] [INFO] [engine.py:864:_create_system] Creating target system ...
[...23:24:18,071] [INFO] [engine.py:867:_create_system] Target system created in 0.000000 seconds
[...23:24:18,071] [INFO] [engine.py:879:_create_system] Creating host system ...
[...23:24:18,071] [INFO] [engine.py:882:_create_system] Host system created in 0.000000 seconds
[...23:24:20,772] [INFO] [engine.py:683:_run_pass] Running pass conversion:onnxconversion
[...23:24:20,989] [ERROR] [engine.py:752:_run_pass] Pass run failed.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
[...23:24:20,997] [WARNING] [engine.py:317:run_accelerator] Failed to run Olive on cpu-cpu.
Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 409, in hf_raise_for_status
    response.raise_for_status()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\requests\models.py", line 1026, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 470, in cached_files
    hf_hub_download(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1008, in hf_hub_download
    return _hf_hub_download_to_cache_dir(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1115, in _hf_hub_download_to_cache_dir
    _raise_on_head_call_error(head_call_error, force_download, local_files_only)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1645, in _raise_on_head_call_error
    raise head_call_error
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1533, in _get_metadata_or_catch_error
    metadata = get_hf_file_metadata(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 1450, in get_hf_file_metadata
    r = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 286, in _request_wrapper
    response = _request_wrapper(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\file_download.py", line 310, in _request_wrapper
    hf_raise_for_status(response)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\huggingface_hub\utils\_http.py", line 426, in hf_raise_for_status
    raise _format(GatedRepoError, message, response) from e
huggingface_hub.errors.GatedRepoError: 401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 313, in run_accelerator
    output_footprint = self._run_no_search(input_model_config, input_model_id, accelerator_spec, output_dir)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 356, in _run_no_search
    should_prune, signal, model_ids = self._run_passes(input_model_config, input_model_id, accelerator_spec)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 639, in _run_passes
    model_config, model_id = self._run_pass(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\engine\engine.py", line 740, in _run_pass
    output_model_config = host.run_pass(p, input_model_config, output_model_path)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\systems\local.py", line 28, in run_pass
    model = model_config.create_model()
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\config\model_config.py", line 42, in create_model
    return cls(**self.config)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\hf.py", line 57, in __init__
    self.model_attributes = {**self.get_hf_model_config().to_dict(), **(self.model_attributes or {})}
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\model\handler\mixin\hf.py", line 42, in get_hf_model_config
    return get_model_config(self.model_path, **self.get_load_kwargs(exclude_load_keys))
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 90, in get_model_config
    model_config = from_pretrained(AutoConfig, model_name_or_path, "config", **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\olive\common\hf\utils.py", line 85, in from_pretrained
    return cls.from_pretrained(get_pretrained_name_or_path(model_name_or_path, mlflow_dir), **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1153, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 595, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\configuration_utils.py", line 654, in _get_config_dict
    resolved_config_file = cached_file(
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 312, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "C:\Users\testusr\anaconda3\envs\huggingface-build\lib\site-packages\transformers\utils\hub.py", line 533, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct.
401 Client Error. (Request ID: Root=1-68502914-33ecb01b288368837fb1bb32;1903ba42-0113-4f37-919f-d540dbe3e225)

Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.2-1B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.2-1B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

That is because you did not log in to Hugging Face before running the command. ^^ So run the "huggingface-cli login" command,

(huggingface-build) c:\temp> huggingface-cli login

    _|    _|  _|    _|    _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|_|_|_|    _|_|      _|_|_|  _|_|_|_|
    _|    _|  _|    _|  _|        _|          _|    _|_|    _|  _|            _|        _|    _|  _|        _|
    _|_|_|_|  _|    _|  _|  _|_|  _|  _|_|    _|    _|  _|  _|  _|  _|_|      _|_|_|    _|_|_|_|  _|        _|_|_|
    _|    _|  _|    _|  _|    _|  _|    _|    _|    _|    _|_|  _|    _|      _|        _|    _|  _|        _|
    _|    _|    _|_|      _|_|_|    _|_|_|  _|_|_|  _|      _|    _|_|_|      _|        _|    _|    _|_|_|  _|_|_|_|

    To log in, `huggingface_hub` requires a token generated from https://huggingface.co/settings/tokens .
Token can be pasted using 'Right-Click'.
Enter your token (input will not be visible):
Add token as git credential? (Y/n) y
Token is valid (permission: read).
The token `test_token` has been saved to c:\temp\python310\cache\huggingface\stored_tokens
Your token has been saved in your configured git credential helpers (manager).
Your token has been saved to c:\temp\python310\cache\huggingface\token
Login successful.
The current active token is: `test_token`

and enter the User access token you issued on the Hugging Face site. In a case like the one above, the token is then stored in the cache, so you do not need to log in again later.
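
If you prefer not to go through the interactive huggingface-cli login (for example on a build server), the token can also be supplied non-interactively. The following is a minimal sketch, assuming the token has been placed in the HF_TOKEN environment variable, which huggingface_hub reads automatically; calling login() has the same effect as the CLI login above.

import os
from huggingface_hub import login

# Option 1: just set the HF_TOKEN environment variable before running the
# command; huggingface_hub (and therefore transformers/Olive) picks it up.
token = os.environ.get("HF_TOKEN")

# Option 2: log in programmatically instead of running huggingface-cli login.
# add_to_git_credential=False keeps the token out of the git credential store.
if token:
    login(token=token, add_to_git_credential=False)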



