Tags: javascript, deployment, environment-variables, pm2, production

Copying .env Files to Multiple Host Machines with PM2


I'm trying to deploy my Node.js script to multiple hosts using PM2's deploy process.

It's working fine using a single host, with the following ecosystem.config.js file:

require("dotenv").config({ path: `./envs/.production.env` });
const path = require("path");

module.exports = {
  apps: [
    {
      name: process.env.APP_NAME,
      interpreter: process.env.NODE_PATH,
      cwd: process.env.PROJECT_PATH,
      script: "dist/index.js",
      instances: process.env.INSTANCES || 0, // 0 = one instance per CPU core
      exec_mode: "cluster",
      env: {
        ...process.env,
      },
    },
  ],
  deploy: {
    production: {
      user: "harrison",
      host: process.env.HOST,
      key: "~/.ssh/id_rsa",
      ref: "origin/master",
      repo: process.env.GIT_REPO,
      path: process.env.PROJECT_PATH,
      // Copy keys to the server
      "pre-deploy-local": `scp -Cr envs harrison@${process.env.HOST}:${process.env.PROJECT_PATH}/current`,
      // Build the app and restart PM2 processes
      "post-deploy": `yarn install --ignore-engines && \
        pwd && \
        yarn prod:build && \
        yarn prod:serve`,
    },
  },
};

To deploy to multiple hosts, the PM2 documentation makes it sound simple: just add multiple host names. Ok, easy enough. Within my .env file, I keep a series of IP addresses separated by commas, then split them into an array inside my config file, like this:

host: process.env.HOST.split(","),

However, copying over my .env files to the multiple hosts is not quite so easy.

How can I configure the "pre-deploy-local" portion of this ecosystem file to scp my .env files to every host machine?


Solution

  • Ended up running a bash script. The list of HOSTS lives in my .production.env file, separated by commas, like this:

    HOSTS=123.12.134.122,134.135.134.134

    To pass them to the bash script, I replace each comma with a space so the shell splits them into separate arguments. That script is then executed during the deploy via "pre-deploy-local".

    My ecosystem file now looks like this:

    // PM2 CONFIGURATION FOR PRODUCTION BUILDS
    require("dotenv").config({ path: `./envs/.production.env` });
    const path = require("path");
    
    // Space-separated host list, passed as arguments to the bash script
    const hosts = process.env.HOSTS.replace(/,/g, " ");
    
    module.exports = {
      apps: [
        {
          name: process.env.APP_NAME,
          args: ["--color"],
          interpreter: process.env.NODE_PATH,
          cwd: path.resolve(process.env.PROJECT_PATH, "current"), // Path holding the current version of the app (where post-deploy runs)
          script: "dist/index.js", // Entry script, relative to cwd
          instances: process.env.INSTANCES || 0, // 0 = one instance per CPU core
          exec_mode: "cluster",
          env: {
            ...process.env,
          },
        },
      ],
      deploy: {
        production: {
          user: "harrison",
          host: process.env.HOSTS.split(","),
          key: "~/.ssh/id_rsa2",
          ref: "origin/master",
          repo: process.env.GIT_REPO,
          // Where to deploy on the server
          path: process.env.PROJECT_PATH,
          // Pass the hosts as arguments to the .env copy script
          "pre-deploy-local": `./deployEnvs.sh ${process.env.PROJECT_PATH} ${hosts}`,
          "post-deploy": `yarn install --ignore-engines && \
             yarn prod:build && \
             yarn prod:serve`,
        },
      },
    };
    

    And the bash script called deployEnvs.sh looks like this:

    #!/bin/bash
    
    # First argument: the deploy path on the remote hosts
    PROJECT_PATH="${1}"
    
    # Every remaining argument is a host; copy the envs directory to each one
    for HOST in "${@:2}"
    do
        scp -Cr envs "harrison@${HOST}:${PROJECT_PATH}/current"
    done
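For robustness, the same script could fail fast on a broken scp and support a dry run before touching real hosts. A minimal sketch under those assumptions (the `copy_envs` function name and `DRY_RUN` flag are my own, not part of PM2):

```shell
#!/bin/bash
# Hardened sketch of deployEnvs.sh: stops on the first failed command and
# on unset variables instead of silently continuing to the next host.
set -euo pipefail

# copy_envs PROJECT_PATH HOST... — copy the envs directory to every host.
# With DRY_RUN=1, print each scp command instead of running it.
copy_envs() {
    local project_path="$1"
    shift
    local host
    for host in "$@"; do
        if [ "${DRY_RUN:-0}" = "1" ]; then
            echo scp -Cr envs "harrison@${host}:${project_path}/current"
        else
            scp -Cr envs "harrison@${host}:${project_path}/current"
        fi
    done
}

# When run as a script, forward the CLI arguments (path, then hosts)
if [ "$#" -ge 2 ]; then
    copy_envs "$@"
fi
```

Invoked the same way from "pre-deploy-local"; prefixing the call with `DRY_RUN=1` previews the scp commands without copying anything.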