The EC2 Classic Resource Finder script is AWS's recommended script for checking whether you have any resources still running on the EC2-Classic network. EC2-Classic is being "discontinued" as of 8/15/22.
Having never used Python, pip, or Boto3, I want to document how I set up and ran the script on my MacBook in case it helps anyone else before the 8/15 deadline. I imagine some folks may be scrambling as the deadline nears, and hopefully this will save you a few hours.
This assumes you are using a recent version of macOS (I am on Monterey 12.4, but I suspect earlier versions will work the same) and have access to your AWS account with the ability to create new IAM users.
The script requires Boto3, the AWS SDK for Python. Naturally, this also requires Python as well as Pip (the package installer for Python).
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
python3 get-pip.py
pip install boto3
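To sanity-check the installs, each of the following should print a version number (this assumes python3 and pip ended up on your PATH):
python3 --version
pip --version
python3 -c "import boto3; print(boto3.__version__)"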
Installing the AWS CLI will expedite setting up your AWS credentials for Boto3. The installation file is linked in the right-hand margin of the AWS CLI page.
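If you prefer Terminal to the GUI installer, a system-wide install of the AWS CLI v2 can also be done like this (the last command just confirms it worked):
curl "https://awscli.amazonaws.com/AWSCLIV2.pkg" -o "AWSCLIV2.pkg"
sudo installer -pkg AWSCLIV2.pkg -target /
aws --version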
To execute the Python script, you need to create an IAM user with the requisite permissions in the AWS account. There are many ways to set up the IAM user, but my simple approach was:
The GitHub readme file lists the permissions the IAM user will need but doesn't give you the JSON policy directly. So, here it is:
Log into AWS, type "IAM" into the top search bar, select "IAM" under Services, then choose "Policies" from the left sidebar menu and create a new policy. Click the "JSON" tab and enter the following JSON in the text area.
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"autoscaling:DescribeAutoScalingGroups",
"datapipeline:GetPipelineDefinition",
"datapipeline:ListPipelines",
"ec2:DescribeAccountAttributes",
"ec2:DescribeAddresses",
"ec2:DescribeInstances",
"ec2:DescribeRegions",
"ec2:DescribeSecurityGroups",
"ec2:DescribeVpcClassicLink",
"elasticbeanstalk:DescribeConfigurationSettings",
"elasticbeanstalk:DescribeEnvironments",
"elasticache:DescribeCacheClusters",
"elasticloadbalancing:DescribeLoadBalancers",
"elasticmapreduce:DescribeCluster",
"elasticmapreduce:ListBootstrapActions",
"elasticmapreduce:ListClusters",
"elasticmapreduce:ListInstanceGroups",
"rds:DescribeDBInstances",
"redshift:DescribeClusters",
"opsworks:DescribeStacks",
"sts:GetCallerIdentity"
],
"Resource": [
"*"
]
}
]
}
Note: If you are using Elastic Beanstalk, you will additionally need to add the other bulleted permissions from the readme file to the Action array above in a similar fashion.
Then go through the remaining steps, give the policy a name (e.g. EC2ClassicResourceFinderScriptPolicy), and save it.
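As an aside, if you already have admin credentials configured for the AWS CLI, the policy could be created from Terminal instead of the console. This is just a sketch, assuming the JSON above has been saved to a file I'm calling policy.json:
aws iam create-policy --policy-name EC2ClassicResourceFinderScriptPolicy --policy-document file://policy.json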
Next, create a new IAM user (e.g. EC2ClassicResourceFinderScriptUser) and attach the newly created policy to this user.
In Terminal, run aws configure and enter the credentials (access key ID and secret access key) for this new user. For the default region name, enter your region (e.g. us-east-1). For the output format, type table (this will result in outputting CSV files after running the script).
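Similarly, the user creation and access key steps could be done with the AWS CLI under admin credentials rather than in the console. Again, just a sketch using the example names from above; <your-account-id> is a placeholder for your twelve-digit account ID, and the last command prints the access key ID and secret to feed into aws configure:
aws iam create-user --user-name EC2ClassicResourceFinderScriptUser
aws iam attach-user-policy --user-name EC2ClassicResourceFinderScriptUser --policy-arn arn:aws:iam::<your-account-id>:policy/EC2ClassicResourceFinderScriptPolicy
aws iam create-access-key --user-name EC2ClassicResourceFinderScriptUser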
In Terminal, cd to the folder containing the script (e.g. cd desktop), then run ls and ensure the py-Classic-Resource-Finder.py script is there.
Run python3 py-Classic-Resource-Finder.py. You should see some logging output like so:
Checking for Classic OpsWorks stacks in ap-southeast-2
Checking for Classic EMR clusters in ap-southeast-1
Checking for Classic EMR clusters in eu-west-1
Checking for Classic EMR clusters in sa-east-1
Checking for Classic Data Pipelines in ap-southeast-2
Checking for Classic OpsWorks stacks in eu-west-1
Checking for Classic OpsWorks stacks in ap-southeast-1
Checking for Classic OpsWorks stacks in sa-east-1
Checking for Classic Data Pipelines in eu-west-1
After the script completes, you should see a new folder on your desktop named with a random number. Open this folder to view the CSV file outputs. These CSV files list any AWS resources still running on EC2-Classic. If a particular file is blank, you do not have anything running on EC2-Classic for the service named in that file.
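If you want to spot the non-empty files quickly from Terminal, something like this from inside the output folder lists only the CSVs that actually contain results (this assumes a blank file is literally zero bytes; adjust if the script writes headers into empty files):
find . -name "*.csv" -size +0c -print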
Also, be sure to check the ..._Errors.txt file for any issues during the run.
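From inside the same output folder, a quick way to eyeball that file is below; the wildcard is only there because I'm not assuming the exact prefix on the file name:
cat *_Errors.txt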