The Packer code is below:
variable "ami_id" {
type = string
default = "ami-xxxxxxxxxx"
}
variable "environment" {
type = string
default = "DEMO"
}
variable "ec2_size" {
type = string
default = "t2.micro"
}
variable "ssh-user" {
type = string
default = "ec2-user"
}
variable "app_name" {
type = string
default = "Test App"
}
locals {
app_name = "Test App"
}
source "amazon-ebs" "rhel8" {
ami_name = "PACKER-POC-${local.app_name}"
instance_type = "${var.ec2_size}"
#region = "${var.region}"
source_ami = "${var.ami_id}"
ssh_username = "${var.ssh-user}"
vpc_id = "vpc-xxxxxxxxxx"
subnet_id = "subnet-xxxxxxxxxx"
ssh_timeout = "5m"
iam_instance_profile = "ASMEC2InstanceProfile"
tags = {
Env = "${var.environment}"
Name = "PACKER-${var.environment}-${var.app_name}"
}
}
build {
sources = ["source.amazon-ebs.rhel8"]
provisioner "shell" {
inline = "mkdir -p /tmp/temp123/"
}
provisioner "file" {
source = "user-data.sh"
destination = "/tmp/temp123/user-data.sh"
}
provisioner "shell" {
inline = [
"chmod 755 /tmp/temp123/user-data.sh"
]
}
provisioner "shell" {
script = "user-data.sh"
}
post-processor "shell-local" {
inline = ["echo Finished with the Test Application Installation"]
}
}
However, I am getting an error like below:
amazon-ebs.rhel8: output will be in this color.
==> amazon-ebs.rhel8: Prevalidating any provided VPC information
==> amazon-ebs.rhel8: Prevalidating AMI Name: PACKER-POC-TestApp
amazon-ebs.rhel8: Found Image ID: ami-034197f3f8ec3a4c8
==> amazon-ebs.rhel8: Creating temporary keypair: packer_640f06fc-a274-aff8-f15e-cd1ac572ee40
==> amazon-ebs.rhel8: Creating temporary security group for this instance: packer_640f06ff-8d73-979b-69ba-b432d60bd675
==> amazon-ebs.rhel8: Authorizing access to port 22 from [0.0.0.0/0] in the temporary security groups...
==> amazon-ebs.rhel8: Launching a source AWS instance...
amazon-ebs.rhel8: Instance ID: i-0bf8682dc88b27e3b
==> amazon-ebs.rhel8: Waiting for instance (i-0bf8682dc88b27e3b) to become ready...
==> amazon-ebs.rhel8: Using SSH communicator to connect: 10.10.164.253
==> amazon-ebs.rhel8: Waiting for SSH to become available...
==> amazon-ebs.rhel8: Connected to SSH!
==> amazon-ebs.rhel8: Provisioning with shell script: /var/folders/m4/mxxzm10d2d774n5j9yn2l54m0000gq/T/packer-shell3653121735
==> amazon-ebs.rhel8: bash: /tmp/script_1399.sh: Permission denied
==> amazon-ebs.rhel8: Provisioning step had errors: Running the cleanup provisioner, if present...
==> amazon-ebs.rhel8: Terminating the source AWS instance...
==> amazon-ebs.rhel8: Cleaning up any extra volumes...
==> amazon-ebs.rhel8: No volumes to clean up, skipping
==> amazon-ebs.rhel8: Deleting temporary security group...
==> amazon-ebs.rhel8: Deleting temporary keypair...
Build 'amazon-ebs.rhel8' errored after 4 minutes 9 seconds: Script exited with non-zero exit status: 126. Allowed exit codes are: [0]
==> Wait completed after 4 minutes 9 seconds
==> Some builds didn't complete successfully and had errors:
--> amazon-ebs.rhel8: Script exited with non-zero exit status: 126. Allowed exit codes are: [0]
==> Builds finished but no artifacts were created.
Goal to accomplish: I want to spin up an EC2 instance, have the user-data shell script executed, and then have an AMI created.
I am keen to understand what I am doing incorrectly, as I ultimately want my application workload baked into an AMI so I can use it to spin up EC2 instances.
I wrote the app.pkr.hcl code and ran the packer validate command, which errors out.
The Packer config file and shell script are in the same folder:
ls -larth
total 24
-rwxr-xr-x@ 1 testuser staff 6.5K Mar 10 11:21 user-data.sh
drwxr-xr-x 10 testuser staff 320B Mar 10 21:08 ..
drwxr-xr-x 4 testuser staff 128B Mar 12 13:56 .
-rw-r--r-- 1 testuser staff 1.4K Mar 13 14:00 app.pkr.hcl
P.S.: I added the updated script and error following Marko's suggestion. I added the instance profile name, but I am baffled as to why I am getting permission denied.
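For context, exit status 126 is the shell's code for "found but not executable", which lines up with the Permission denied line in the build log. A minimal reproduction outside of Packer (nothing here is Packer-specific):

```shell
#!/bin/sh
# Exit status 126: the file exists but the kernel refuses to execute it.
tmpfile=$(mktemp)
echo 'echo hello' > "$tmpfile"
chmod 644 "$tmpfile"     # readable, but no execute bit
"$tmpfile"               # the shell reports "Permission denied" on stderr
echo "exit status: $?"   # prints: exit status: 126
rm -f "$tmpfile"
```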
There are a couple of things to fix to make this work:
- A shell provisioner that runs a single command cannot use the script argument; it should use the inline argument instead, which takes a list of strings.
- The apt-get part will not work: it is the wrong package manager for RHEL, which uses yum.
- The script argument is supposed to represent the path to the script locally.
For the third point, the documentation says:
script (string) - The path to a script to upload and execute in the machine. This path can be absolute or relative. If it is relative, it is relative to the working directory when Packer is executed.
That means that the last shell provisioner has to point to the script in the same directory, based on your question. The file that passes validation looks like this:
variable "ami_id" {
type = string
default = "ami-xxxxxxxxxx"
}
variable "environment" {
type = string
default = "DEMO"
}
variable "ec2_size" {
type = string
default = "t2.micro"
}
variable "ssh-user" {
type = string
default = "ec2-user"
}
variable "app_name" {
type = string
default = "Test App"
}
locals {
app_name = "Test App"
}
source "amazon-ebs" "rhel8" {
ami_name = "PACKER-POC-${local.app_name}"
instance_type = "${var.ec2_size}"
#region = "${var.region}"
source_ami = "${var.ami_id}"
ssh_username = "${var.ssh-user}"
vpc_id = "vpc-xxxxxxxxxx"
subnet_id = "subnet-xxxxxxxxxx"
tags = {
Env = "${var.environment}"
Name = "PACKER-${var.environment}-${var.app_name}"
}
}
build {
sources = ["source.amazon-ebs.rhel8"]
provisioner "shell" {
inline = ["mkdir -p /tmp/temp123/"]
}
provisioner "file" {
source = "user-data.sh"
destination = "/tmp/temp123/user-data.sh"
}
provisioner "shell" {
inline = [
"chmod 755 /tmp/temp123/user-data.sh",
"echo Installing Updates",
"sudo yum update",
"sudo yum install -y nginx"
]
}
provisioner "shell" {
script = "user-data.sh"
}
post-processor "shell-local" {
inline = ["echo Finished with the Test Application Installation"]
}
}
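Since the script path is resolved relative to where Packer is invoked, the build should be run from the folder that contains both files. A sketch of the invocation (the filename app.pkr.hcl comes from the question; the directory path is a placeholder):

```shell
# Run from the directory containing both app.pkr.hcl and user-data.sh,
# so the relative script = "user-data.sh" reference resolves correctly.
cd /path/to/project        # hypothetical location of the two files
packer validate app.pkr.hcl
packer build app.pkr.hcl
```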