amazon-web-services, amazon-ec2, amazon-vpc

How do I run the EC2 Classic Resource Finder script on macOS and create the appropriate IAM user for the script?


The EC2 Classic Resource Finder script is AWS's recommended script to check for any resources you may be using in AWS on the EC2-Classic network. EC2-Classic is being "discontinued" as of 8/15/22.

Having never used Python, pip, or Boto3, I want to document how I set up and ran the script on my MacBook in case it helps anyone else before the 8/15 deadline. I imagine some folks may be scrambling as the deadline nears, and hopefully this will save you a few hours.

This assumes you are using a recent version of macOS (I am on Monterey 12.4 but I suspect earlier versions will work the same) and have access to your AWS account with the ability to create new IAM users.


Solution

  • The script requires Boto3, the AWS SDK for Python, which in turn requires Python and pip (the package installer for Python).

    Install Python

    1. Install the latest version of Python by downloading the latest macOS 64-bit universal2 installer from the Python downloads page (python.org/downloads).
    2. Follow the setup steps and be sure to do the last step -- double-click the "Install Certificates.command" file to install the root SSL certificates (a quick way to verify this worked is shown below).
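
    If you want to confirm the certificates installed correctly, a quick sanity check (not part of the script) is to make an HTTPS request from Python:

        # Run with python3. If the root certificates were installed correctly,
        # this HTTPS request should succeed without an SSLCertVerificationError.
        import urllib.request

        with urllib.request.urlopen("https://pypi.org") as response:
            print(response.status)  # expect 200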

    Install pip and Boto3

    1. Open up Terminal (CMD-SPACE > "Terminal")
    2. Type the following commands, pressing ENTER after each (thanks to roktech on YouTube):
      1. Download pip: curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
      2. Install pip: python3 get-pip.py
      3. Install Boto3: pip install boto3 (a quick check that this worked follows this list)
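
    To confirm pip and Boto3 installed correctly, a quick check from Python:

        # Verify that Boto3 is importable and print its version.
        import boto3

        print(boto3.__version__)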

    Install AWS CLI

    This will make it easier to set up your AWS credentials for Boto3. The macOS installer is linked from the AWS CLI page.

    Create the appropriate IAM user in your AWS account

    To run the Python script, you need to create an IAM user with the requisite permissions in the AWS account. There are many ways to set up the IAM user, but my simple approach was:

    Create an IAM policy specific to the needs of the script.

    The GitHub readme lists the permissions the IAM user will need but doesn't give you a ready-made JSON policy, so here it is:

    Log into AWS, type "IAM" in the top search bar, select "IAM" under Services, then choose "Policies" in the left sidebar. Create a new policy, click the "JSON" tab, and enter the following JSON in the text area.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "autoscaling:DescribeAutoScalingGroups",
                    "datapipeline:GetPipelineDefinition",
                    "datapipeline:ListPipelines",
                    "ec2:DescribeAccountAttributes",
                    "ec2:DescribeAddresses",
                    "ec2:DescribeInstances",
                    "ec2:DescribeRegions",
                    "ec2:DescribeSecurityGroups",
                    "ec2:DescribeVpcClassicLink",
                    "elasticbeanstalk:DescribeConfigurationSettings",
                    "elasticbeanstalk:DescribeEnvironments",
                    "elasticache:DescribeCacheClusters",
                    "elasticloadbalancing:DescribeLoadBalancers",
                    "elasticmapreduce:DescribeCluster",
                    "elasticmapreduce:ListBootstrapActions",
                    "elasticmapreduce:ListClusters",
                    "elasticmapreduce:ListInstanceGroups",
                    "rds:DescribeDBInstances",
                    "redshift:DescribeClusters",
                    "opsworks:DescribeStacks",
                    "sts:GetCallerIdentity"
                ],
                "Resource": [
                    "*"
                ]
            }
        ]
    }
    

    Note: If you are using Elastic Beanstalk, you will also need to add the additional actions bulleted in the readme file to the Action array above, in the same fashion.

    Then go through the remaining steps, give the policy a name (e.g. EC2ClassicResourceFinderScriptPolicy), and save.
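
    (If you are already comfortable with Boto3 and have admin credentials configured locally, you could create the same policy from a script instead of the console. A minimal sketch, assuming the JSON above is saved to a file named ec2-classic-finder-policy.json -- the file name and policy name are just examples:)

        import boto3

        # Read the policy document shown above from a local file.
        with open("ec2-classic-finder-policy.json") as f:
            policy_document = f.read()

        iam = boto3.client("iam")
        response = iam.create_policy(
            PolicyName="EC2ClassicResourceFinderScriptPolicy",
            PolicyDocument=policy_document,
        )
        print(response["Policy"]["Arn"])  # keep this ARN for attaching to the user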

    Next, add a new user with the policy

    1. Go to "Users" on the left sidebar and "Add users".
    2. Enter a name for the user (e.g. EC2ClassicResourceFinderScriptUser) and attach the newly created policy to this user.
    3. Under "Select AWS access type", choose "Access key - Programmatic access"
    4. Go through the remaining steps and save, copying the Access Key ID and Secret Access Key when provided (if you prefer to script the user setup instead of using the console, see the sketch below).
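
    (Likewise, the user setup can be scripted with Boto3 if you prefer that to the console. A minimal sketch using the example names from above -- the account ID in the policy ARN is a placeholder you would swap for your own:)

        import boto3

        iam = boto3.client("iam")
        user_name = "EC2ClassicResourceFinderScriptUser"

        # Create the user and attach the policy created earlier (use its ARN).
        iam.create_user(UserName=user_name)
        iam.attach_user_policy(
            UserName=user_name,
            PolicyArn="arn:aws:iam::123456789012:policy/EC2ClassicResourceFinderScriptPolicy",
        )

        # Create the programmatic access key; copy both values somewhere safe.
        key = iam.create_access_key(UserName=user_name)["AccessKey"]
        print(key["AccessKeyId"], key["SecretAccessKey"])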

    Set up the IAM user on the AWS CLI

    1. Open up Terminal again and type aws configure
    2. When prompted, enter the Access Key ID and Secret Access Key, as well as the default region (e.g. us-east-1). For the output format you can type table or accept the default; the script writes its results to CSV files regardless of this setting. (A quick check that Boto3 sees these credentials follows below.)
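
    To confirm Boto3 picks up the credentials you just configured, a quick check (sts:GetCallerIdentity is already included in the policy above):

        import boto3

        # Boto3 automatically reads the credentials and region that
        # `aws configure` wrote to ~/.aws/credentials and ~/.aws/config.
        sts = boto3.client("sts")
        print(sts.get_caller_identity()["Arn"])  # should show the new IAM user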

    Download and run the script (!)

    1. Download the raw script to your desktop.
    2. Open up Terminal again and change to your desktop directory (cd ~/Desktop)
    3. Type ls and ensure the py-Classic-Resource-Finder.py script is there.
    4. Then run the script by typing python3 py-Classic-Resource-Finder.py. You should see some logging output like so...
    Checking for Classic OpsWorks stacks in ap-southeast-2
    Checking for Classic EMR clusters in ap-southeast-1
    Checking for Classic EMR clusters in eu-west-1
    Checking for Classic EMR clusters in sa-east-1
    Checking for Classic Data Pipelines in ap-southeast-2
    Checking for Classic OpsWorks stacks in eu-west-1
    Checking for Classic OpsWorks stacks in ap-southeast-1
    Checking for Classic OpsWorks stacks in sa-east-1
    Checking for Classic Data Pipelines in eu-west-1
    

    After this completes, you should see a new folder on your desktop named with a random number. Open this folder to view the CSV outputs. These CSV files list any AWS resources still running on EC2-Classic. If a specific file is blank, you do not have anything running on EC2-Classic for the service named in that CSV file.
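
    If you would rather not open each CSV by hand, a small sketch to list only the non-empty ones (the folder name below is a placeholder for whatever random number the script generated for you):

        import csv
        from pathlib import Path

        # Point this at the output folder the script created on your desktop.
        output_dir = Path.home() / "Desktop" / "1234567890"  # placeholder folder name

        for csv_file in sorted(output_dir.glob("*.csv")):
            with csv_file.open(newline="") as f:
                rows = list(csv.reader(f))
            if rows:  # a non-empty file means EC2-Classic resources were found
                print(f"{csv_file.name}: {len(rows)} row(s)")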

    Also, be sure to check the ..._Errors.txt file for any issues during the run.