Tags: node.js, amazon-s3, next.js, aws-sdk, aws-sdk-js

Uploading a static site to AWS S3 with a Node.js script breaks static website hosting


I am planning to deploy a Next.js static export to an S3 bucket with a Node.js script.

I have set up an S3 bucket for static website hosting.

I get the expected behaviour when I simply drag and drop the static export into the S3 bucket, so I am pretty sure the bucket is set up correctly.

But when I upload it with a Node.js script, the static website hosting behaviour seems to break, even though all the files appear in the bucket.

My script is basically copied from the answer below, with a twist: it reads the environment variables from .env:

Upload entire directory tree to S3 using AWS sdk in node js
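
For reference, the script expects a .env file along these lines; the variable names come from the script itself, and the values here are just placeholders:

    BUCKET_NAME=my-static-site-bucket
    AWS_ACCESS_KEY_ID=AKIA...
    AWS_SECRET_ACCESS_KEY=...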

More info and steps to reproduce this problem are in the testing repo:

https://github.com/vmia159/aws-upload-test

I would appreciate it if someone has any idea about the issue.


Solution

  • https://github.com/aws/aws-sdk-js/issues/4279

    As chrisradek points out, you need to provide the content type yourself to make static website hosting work. The S3 console infers the MIME type from the file extension when you drag and drop, but the SDK does not, so objects uploaded by the script are stored with a generic binary content type and browsers download them instead of rendering them:

    require('dotenv').config();
    const bucketName = process.env.BUCKET_NAME;
    const { promises: fs, createReadStream } = require('fs');
    const path = require('path');
    const { S3 } = require('aws-sdk');
    const mime = require('mime-types');
    
    const s3 = new S3({
      // Credentials come from .env; the region is picked up from the
      // AWS_REGION environment variable or the shared AWS config
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
    });
    
    const uploadDir = async (s3Path, bucketName) => {
      // Recursive getFiles from
      // https://stackoverflow.com/a/45130990/831465
    
      async function getFiles(dir) {
        const dirents = await fs.readdir(dir, { withFileTypes: true });
        const files = await Promise.all(
          dirents.map(dirent => {
            const res = path.resolve(dir, dirent.name);
            return dirent.isDirectory() ? getFiles(res) : res;
          })
        );
        return Array.prototype.concat(...files);
      }
    
      const files = await getFiles(s3Path);
      const uploads = files.map(filePath =>
        s3
          .putObject({
            // Build the key relative to the export directory, using forward
            // slashes so the keys are also correct on Windows
            Key: path.relative(s3Path, filePath).split(path.sep).join('/'),
            Bucket: bucketName,
            Body: createReadStream(filePath),
            // The crucial fix: set an explicit content type, falling back to
            // a generic binary type when the extension is unknown
            ContentType: mime.lookup(filePath) || 'application/octet-stream'
          })
          .promise()
          .catch(err => {
            console.log(`failed to upload ${filePath}: ${err.message}`);
          })
      );
      return Promise.all(uploads);
    };
    
    const uploadProcess = async () => {
      await uploadDir(path.resolve('./out'), bucketName);
      console.log('Upload finished');
    };
    uploadProcess();
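
    Once the script has run, you can spot-check that the content types were actually stored. This is a minimal sketch, not part of the original answer; it assumes the same credentials in .env and that your export contains an index.html at the bucket root:

    require('dotenv').config();
    const { S3 } = require('aws-sdk');

    const s3 = new S3({
      accessKeyId: process.env.AWS_ACCESS_KEY_ID,
      secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY
    });

    // headObject returns the stored metadata without downloading the body;
    // ContentType should now be text/html rather than a binary type
    s3.headObject({ Bucket: process.env.BUCKET_NAME, Key: 'index.html' })
      .promise()
      .then(res => console.log(res.ContentType))
      .catch(err => console.error(err.message));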