Tags: python, amazon-s3, aws-cdk, aws-codepipeline, aws-codebuild

CodeBuild stage fails with "Specified key does not exist" error in AWS CDK 2.63.2


I have the following CDK code that creates a CloudFormation stack containing an S3 bucket, a CodeCommit repository, and a CodePipeline.

    assets = s3d.BucketDeployment(
        self,
        "codeassets",
        destination_bucket=pipeline_bucket,
        role=reader_role,
        cache_control=[
            s3d.CacheControl.from_string(
                "max-age=0,no-cache,no-store,must-revalidate"
            )
        ],
        memory_limit=2048,
        sources=[s3d.Source.asset(assetspath)],
    )


    q = codecommit.CfnRepository(
        scope=self,
        code={
            "branch_name": "main",
            "s3": {"bucket": pipeline_bucket.bucket_name, "key": "code.zip"},
        },
        id="coderepo",
        repository_name=pipeline.repo,
    )
    q.node.add_dependency(assets)

    p = codepipeline.Pipeline(
        scope=self,
        id=f"{pipeline.name}",
        pipeline_name=f"{pipeline.name}",
        restart_execution_on_update=True,
        artifact_bucket=pipeline_bucket,
    )
    p.node.add_dependency(q)

Even though the dependencies are added correctly, the CodePipeline stages fail with the following error every time the CloudFormation stack is updated.

    [Container] 2023/10/17 10:01:14 Waiting for agent ping
    [Container] 2023/10/17 10:01:26 Waiting for DOWNLOAD_SOURCE
    NoSuchKey: The specified key does not exist.
    status code: 404, request id: 3Z90D1YGC46ZQTXZ, host id: sgsg+kiejfbfbhj+sgrf+/sg= for primary source and source version arn:aws:s3:::pipeline-name-us-east-1-63bd58d0/pipeline-name/Artifact_S/SJKGDS

Provided below is one of the stages where the error appears after the stack update.

Source stage

    p.add_stage(
        stage_name="Source",
        actions=[
            codepipeline_actions.CodeCommitSourceAction(
                action_name="CodeCommit",
                branch="main",
                output=source_artifact,
                trigger=codepipeline_actions.CodeCommitTrigger.EVENTS,
                repository=codecommit.Repository.from_repository_name(
                    self, "source_glue_repo", repository_name=pipeline.repo
                ),
            )
        ],
    )

    p.add_stage(
        stage_name="UnitTests",
        actions=[
            codepipeline_actions.CodeBuildAction(
                action_name="UnitTests",
                input=source_artifact,
                project=build_project_run_tests,
                outputs=[build_artifact_run_tests],
            )
        ],
    )

    build_artifact_run_tests = codepipeline.Artifact()

When I check S3, I cannot find the "pipeline-name/Artifact_S/" folder at all.

But when I click the "Release change" button at the top right of the pipeline page in the AWS console, the artifact folders are created and the pipeline executions succeed.

How do I ensure the Source stage's output artifacts are in S3 before the UnitTests stage uses them as its input artifact?


Solution

  • The asset you upload to create the CodeCommit repo overwrites the bucket's contents on every update.

    The code prop of a CodeCommit repository is only used at creation time; any subsequent changes to it are ignored. What does change on every update is the bucket: by default, the BucketDeployment construct prunes the destination bucket, removing every existing object before uploading your asset. Since the deployment's destination bucket is also the pipeline's artifact bucket (both are pipeline_bucket in your code), the prune step deletes the pipeline's source artifacts, which is why CodeBuild fails with NoSuchKey until a fresh "Release change" recreates them. Set prune to False to disable this behavior.
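As a sketch, the fix is a one-line change to the BucketDeployment from the question (all other names — pipeline_bucket, reader_role, assetspath — are the question's own):

    assets = s3d.BucketDeployment(
        self,
        "codeassets",
        destination_bucket=pipeline_bucket,
        role=reader_role,
        # Don't delete objects already in the bucket (e.g. the pipeline's
        # artifacts under "pipeline-name/"); only upload/overwrite the asset.
        prune=False,
        cache_control=[
            s3d.CacheControl.from_string(
                "max-age=0,no-cache,no-store,must-revalidate"
            )
        ],
        memory_limit=2048,
        sources=[s3d.Source.asset(assetspath)],
    )

Alternatively, you could avoid the collision entirely by giving the pipeline its own artifact bucket instead of sharing pipeline_bucket with the deployment.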