
DVC - Forbidden: An error occurred (403) when calling the HeadObject operation


I just started with DVC. The following are the steps I am taking to push my models to S3.

Initialize

dvc init

Add bucket url

dvc remote add -d storage s3://mybucket/dvcstore
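(For reference, the -d flag marks this remote as the default one used by dvc push/dvc pull. After the command, .dvc/config should look roughly like this — a sketch, assuming the bucket URL above:)

```ini
['remote "storage"']
    url = s3://mybucket/dvcstore
[core]
    remote = storage
```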

Add some files

dvc add somefiles

Add aws keys

dvc remote modify storage access_key_id AWS_ACCESS_KEY_ID
dvc remote modify storage secret_access_key AWS_SECRET_ACCESS_KEY
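(Side note: written this way, the keys end up in .dvc/config, which is normally committed to Git. A sketch of the same commands with DVC's --local flag, which stores the values in .dvc/config.local so they stay out of version control:)

```shell
# Store credentials in .dvc/config.local (gitignored) instead of .dvc/config
dvc remote modify --local storage access_key_id AWS_ACCESS_KEY_ID
dvc remote modify --local storage secret_access_key AWS_SECRET_ACCESS_KEY
```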

Now, when I push

dvc push

it shows:

ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden

Am I missing something?

Update 1

Result of dvc doctor:

C:\my-server>dvc doctor
DVC version: 2.7.4 (pip)
---------------------------------
Platform: Python 3.8.0 on Windows-10-10.0.19041-SP0
Supports:
        http (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
        https (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
        s3 (s3fs = 2021.8.1, boto3 = 1.17.106)
Cache types: hardlink
Cache directory: NTFS on C:\
Caches: local
Remotes: s3
Workspace directory: NTFS on C:\
Repo: dvc, git

and of dvc push -vv:

C:\my-server>dvc push -vv  
2021-09-21 13:21:38,382 TRACE: Namespace(all_branches=False, all_commits=False, all_tags=False, cd='.', cmd='push', cprofile=False, cprofile_dump=None, func=<class 'dvc.command.data_sync.CmdDataPush'>, glob=False, instrument=False, instrument_open=False, jobs=None, pdb=False, quiet=0, recursive=False, remote=None, run_cache=False, targets=[], verbose=2, version=None, with_deps=False)
2021-09-21 13:21:39,293 TRACE: Assuming 'C:\my-server\.dvc\cache\02\5b196462b86d2f10a9f659e2224da8.dir' is unchanged since 
it is read-only
2021-09-21 13:21:39,296 TRACE: Assuming 'C:\my-server\.dvc\cache\02\5b196462b86d2f10a9f659e2224da8.dir' is unchanged since 
it is read-only
2021-09-21 13:21:40,114 DEBUG: Preparing to transfer data from '.dvc\cache' to 's3://my-bucket/models'
2021-09-21 13:21:40,117 DEBUG: Preparing to collect status from 's3://my-bucket/models'
2021-09-21 13:21:40,119 DEBUG: Collecting status from 's3://my-bucket/models'
2021-09-21 13:21:40,121 DEBUG: Querying 1 hashes via object_exists
2021-09-21 13:21:44,840 ERROR: unexpected error - Forbidden: An error occurred (403) when calling the HeadObject operation: Forbidden
------------------------------------------------------------
Traceback (most recent call last):
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 248, in _call_s3
    out = await method(**additional_kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\aiobotocore\client.py", line 155, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (InvalidAccessKeyId) when calling the ListObjectsV2 operation: The AWS Access Key Id you provided does not exist in our records.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 1080, in _info
    out = await self._simple_info(path)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 993, in _simple_info
    out = await self._call_s3(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 268, in _call_s3
    raise err
PermissionError: The AWS Access Key Id you provided does not exist in our records.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 248, in _call_s3
    out = await method(**additional_kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\aiobotocore\client.py", line 155, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (403) when calling the HeadObject operation: Forbidden

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\main.py", line 55, in main
    ret = cmd.do_run()
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\command\base.py", line 45, in do_run
    return self.run()
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\command\data_sync.py", line 57, in run
    processed_files_count = self.repo.push(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\repo\__init__.py", line 50, in wrapper
    return f(repo, *args, **kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\repo\push.py", line 48, in push
    pushed += self.cloud.push(obj_ids, jobs, remote=remote, odb=odb)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\data_cloud.py", line 85, in push
    return transfer(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\transfer.py", line 153, in transfer
    status = compare_status(src, dest, obj_ids, check_deleted=False, **kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\status.py", line 160, in compare_status
    dest_exists, dest_missing = status(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\status.py", line 122, in status
    exists = hashes.intersection(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\status.py", line 48, in _indexed_dir_hashes
    dir_exists.update(odb.list_hashes_exists(dir_hashes - dir_exists))
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\db\base.py", line 415, in list_hashes_exists
    ret = list(itertools.compress(hashes, in_remote))
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\concurrent\futures\_base.py", line 611, in result_iterator
    yield fs.pop().result()
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\concurrent\futures\_base.py", line 439, in result
    return self.__get_result()
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\concurrent\futures\_base.py", line 388, in __get_result
    raise self._exception
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\concurrent\futures\thread.py", line 57, in run
    result = self.fn(*self.args, **self.kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\objects\db\base.py", line 406, in exists_with_progress
    ret = self.fs.exists(path_info)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\dvc\fs\fsspec_wrapper.py", line 97, in exists
    return self.fs.exists(self._with_bucket(path_info))
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\fsspec\asyn.py", line 88, in wrapper
    return sync(self.loop, func, *args, **kwargs)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\fsspec\asyn.py", line 69, in sync
    raise result[0]
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\fsspec\asyn.py", line 25, in _runner
    result[0] = await coro
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 820, in _exists
    await self._info(path, bucket, key, version_id=version_id)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 1084, in _info
    out = await self._version_aware_info(path, version_id)
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 1027, in _version_aware_info
    out = await self._call_s3(
  File "c:\users\sgarg\appdata\local\programs\python\python38\lib\site-packages\s3fs\core.py", line 268, in _call_s3
    raise err
PermissionError: Forbidden
------------------------------------------------------------
2021-09-21 13:21:45,178 DEBUG: Version info for developers:
DVC version: 2.7.4 (pip)
---------------------------------
Platform: Python 3.8.0 on Windows-10-10.0.19041-SP0
Supports:
        http (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
        https (aiohttp = 3.7.4.post0, aiohttp-retry = 2.4.5),
        s3 (s3fs = 2021.8.1, boto3 = 1.17.106)
Cache types: hardlink
Cache directory: NTFS on C:\
Caches: local
Remotes: s3
Workspace directory: NTFS on C:\
Repo: dvc, git

Having any troubles? Hit us up at https://dvc.org/support, we are always happy to help!
2021-09-21 13:21:45,185 DEBUG: Analytics is enabled.
2021-09-21 13:21:45,446 DEBUG: Trying to spawn '['daemon', '-q', 'analytics', 'C:\\Users\\sgarg\\AppData\\Local\\Temp\\tmpm_p9f3eq']'
2021-09-21 13:21:45,456 DEBUG: Spawned '['daemon', '-q', 'analytics', 'C:\\Users\\sgarg\\AppData\\Local\\Temp\\tmpm_p9f3eq']'

Solution

  • Could you please run dvc doctor, then rerun dvc push with the -vv flag and share both outputs? The verbose traceback reveals the underlying cause:

    PermissionError: The AWS Access Key Id you provided does not exist in our records.
    

    Does the AWS CLI work correctly for you? First set AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY as environment variables, then run:

    aws s3 ls s3://mybucket/dvcstore
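
    On Windows (where the dvc doctor output above was collected), that check looks roughly like this — a sketch with placeholder values, not your real keys:

```shell
:: Windows cmd: set credentials for the current session (placeholders)
set AWS_ACCESS_KEY_ID=<your-access-key-id>
set AWS_SECRET_ACCESS_KEY=<your-secret-access-key>

:: Verify the credentials can list the DVC remote
aws s3 ls s3://mybucket/dvcstore
```

    If this listing fails with the same InvalidAccessKeyId / 403 Forbidden error, the problem is with the credentials themselves rather than with DVC.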