
How to make an S3 deployment with the modern version of CodePipeline


I am trying to set up a brand new pipeline with the latest version of the AWS CDK for TypeScript (1.128).

Creating the pipeline is straightforward; I have added source and build stages with no issues. The objective is to deploy a static landing page automatically.

So far I have this piece of code:

        const landingPageStep = new ShellStep(`${PREFIX}LandingPageCodeBuildStep`, {
            input: CodePipelineSource.connection(`${GIT_ORG}/vicinialandingpage`, GIT_MAIN, {
                connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
            }),
            installCommands: [
                'npm ci',
            ],
            commands: [
                'npm run build',
            ],
            primaryOutputDirectory: 'out',
        })

        const pipeline = new CodePipeline(this, `${PREFIX}Pipeline`, {
            pipelineName: `${PREFIX}Pipeline`,
            synth: new ShellStep(`${PREFIX}Synth`, {
                input: CodePipelineSource.connection(`${GIT_ORG}/viciniacdk`, GIT_MAIN, {
                    connectionArn: GIT_CONNECTION_ARN, // Created using the AWS console
                }),
                commands: [
                    'npm ci',
                    'npm run build',
                    'npx cdk synth',
                ],
                additionalInputs: {
                    'landing_page': landingPageStep,
                },
            }),
        });

The step I am not sure how to achieve is the deployment to S3 using the output of "landing_page". Previous versions of Pipelines made heavy use of Artifact objects and CodePipelineActions, with something similar to this, where sourceOutput is an Artifact object:

    const targetBucket = new s3.Bucket(this, 'MyBucket', {});

    const pipeline = new codepipeline.Pipeline(this, 'MyPipeline');
    const deployAction = new codepipeline_actions.S3DeployAction({
        actionName: 'S3Deploy',
        bucket: targetBucket,
        input: sourceOutput,
    });
    const deployStage = pipeline.addStage({
        stageName: 'Deploy',
        actions: [deployAction],
    });

Now it is completely different: you work with FileSet objects, and the build steps are apparently intended to be composed by nesting outputs, as in the example above. Every output file is saved in a bucket under ugly file names, so it is not meant to be accessed directly either.
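For reference, a step's output is exposed as a FileSet handle rather than an Artifact; a minimal sketch, using the landingPageStep defined above:

    // The FileSet produced by primaryOutputDirectory: 'out' above. It is stored
    // in the pipeline's artifact bucket under generated names.
    const landingPageOutput = landingPageStep.primaryOutput!; // FileSet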

I have seen some hacky approaches that replace ShellStep with CodeBuildStep and add a post-build command like this to the buildspec.yml file:

aws s3 sync out s3://cicd-codebuild-static-website/

But that runs in the build stage, not in a deployment stage, where it would ideally live.
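For completeness, a sketch of that hack (the IAM statement is an assumption; the build role needs write access to the bucket):

    import { CodeBuildStep, CodePipelineSource } from '@aws-cdk/pipelines';
    import * as iam from '@aws-cdk/aws-iam';

    const landingPageStep = new CodeBuildStep(`${PREFIX}LandingPageCodeBuildStep`, {
        input: CodePipelineSource.connection(`${GIT_ORG}/vicinialandingpage`, GIT_MAIN, {
            connectionArn: GIT_CONNECTION_ARN,
        }),
        installCommands: ['npm ci'],
        commands: [
            'npm run build',
            // The deployment happens here, inside the build stage:
            'aws s3 sync out s3://cicd-codebuild-static-website/',
        ],
        primaryOutputDirectory: 'out',
        rolePolicyStatements: [
            new iam.PolicyStatement({
                actions: ['s3:GetBucketLocation', 's3:ListBucket', 's3:PutObject', 's3:DeleteObject'],
                resources: [
                    'arn:aws:s3:::cicd-codebuild-static-website',
                    'arn:aws:s3:::cicd-codebuild-static-website/*',
                ],
            }),
        ],
    })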


Solution

  • You can extend Step and implement ICodePipelineActionFactory. It's an interface that receives a codepipeline.IStage and adds whatever actions you need to it.

    Once you have the factory step, you pass it in either the pre or post option of the addStage() method.

    Something close to the following should work:

    import * as codepipeline from '@aws-cdk/aws-codepipeline';
    import * as codepipeline_actions from '@aws-cdk/aws-codepipeline-actions';
    import * as s3 from '@aws-cdk/aws-s3';
    import { Step, FileSet, ICodePipelineActionFactory, ProduceActionOptions, CodePipelineActionFactoryResult } from '@aws-cdk/pipelines';

    class S3DeployStep extends Step implements ICodePipelineActionFactory {
      constructor(private readonly bucket: s3.IBucket, private readonly fileSet: FileSet) {
        super('S3DeployStep');
      }

      public produceAction(stage: codepipeline.IStage, options: ProduceActionOptions): CodePipelineActionFactoryResult {
        stage.addAction(new codepipeline_actions.S3DeployAction({
            actionName: options.actionName,
            bucket: this.bucket,
            // Convert the FileSet into the Artifact object the action expects
            input: options.artifacts.toCodePipeline(this.fileSet),
            runOrder: options.runOrder,
        }));

        return { runOrdersConsumed: 1 };
      }
    }
    
    // ...
    
    pipeline.addStage(stage, { post: [new S3DeployStep(targetBucket, landingPageStep.primaryOutput!)] });
    

    But a much simpler method would be to use BucketDeployment to do it as part of the stack deployment. It creates a custom resource that copies data into a bucket from your assets or from another bucket. It won't get its own step in the pipeline, and it creates a Lambda function under the hood, but it's simpler to use.
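    A minimal sketch, assuming a CDK v1 project and that the built site lands in out (the bucket name and website settings are assumptions):

    import * as s3 from '@aws-cdk/aws-s3';
    import * as s3deploy from '@aws-cdk/aws-s3-deployment';

    const websiteBucket = new s3.Bucket(this, 'LandingPageBucket', {
        websiteIndexDocument: 'index.html',
        publicReadAccess: true, // assumption: plain public static hosting
    });

    // Copies the built files into the bucket during stack deployment via a
    // Lambda-backed custom resource; no extra pipeline stage is needed.
    new s3deploy.BucketDeployment(this, 'DeployLandingPage', {
        sources: [s3deploy.Source.asset('./out')],
        destinationBucket: websiteBucket,
    });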