I am launching an EMR cluster, and during bootstrapping I assign a private IP address to its master node. This IP gets associated when the cluster starts and released when the cluster terminates. How do I guarantee that this IP is always available for my cluster and that no other process picks it up, even when it is not in use?
Once I have this IP, I submit my Spark jars to port 8998 of this IP using the Apache Livy REST API.
So my use case is to expose an IP on the master node so that the Livy APIs can submit jobs to EMR.
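For context, a minimal sketch of the kind of submission described above, assuming Livy's POST /batches endpoint on port 8998. The IP address, jar path, and class name are placeholders, not values from the question:

```python
import json
from urllib import request

def batch_payload(jar, class_name, args=()):
    # Body for Livy's POST /batches: "file" is required, the rest optional.
    body = {"file": jar, "className": class_name}
    if args:
        body["args"] = list(args)
    return body

def submit_batch(master_ip, jar, class_name, args=()):
    """POST a batch job to Livy running on the EMR master node."""
    req = request.Request(
        f"http://{master_ip}:8998/batches",
        data=json.dumps(batch_payload(jar, class_name, args)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return request.urlopen(req)  # e.g. submit_batch("10.0.0.5", "s3://bucket/app.jar", "com.example.Main")
```

This is exactly why a stable address (or name) for the master node matters: the URL above breaks every time the cluster is relaunched with a different IP.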
It is not possible to request a specific private IP address for the cluster.
Instead, I would suggest:

- Creating a Private Hosted Zone in Amazon Route 53
- Creating a Record Set in that zone with a fixed name (eg cluster.private)
- Updating the Record Set to point to the master node whenever a new cluster is launched

Updating the Record Set can be done programmatically. A script would do something like:

- Call list_clusters() to find running clusters (eg ones where a particular tag is present)
- Call describe_cluster() to obtain the DNS name of the master node
- Update the Record Set so that cluster.private points to that DNS name
This script would need to be triggered after the cluster is launched.
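The steps above can be sketched as follows, assuming Python with boto3 and a Route 53 Private Hosted Zone. The zone ID, record name, and tag used below are hypothetical placeholders:

```python
def find_master_dns(emr, tag_key="Name", tag_value="livy-cluster"):
    """Find the active cluster carrying the given tag and return the
    DNS name of its master node."""
    clusters = emr.list_clusters(ClusterStates=["RUNNING", "WAITING"])["Clusters"]
    for summary in clusters:
        cluster = emr.describe_cluster(ClusterId=summary["Id"])["Cluster"]
        tags = {t["Key"]: t["Value"] for t in cluster.get("Tags", [])}
        if tags.get(tag_key) == tag_value:
            # For a cluster in a VPC private subnet, this field resolves
            # to the master node's private address.
            return cluster["MasterPublicDnsName"]
    return None

def upsert_record(route53, zone_id, name, target_dns):
    """UPSERT a CNAME so the fixed name always points at the current master."""
    route53.change_resource_record_sets(
        HostedZoneId=zone_id,
        ChangeBatch={"Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": name,
                "Type": "CNAME",
                "TTL": 60,
                "ResourceRecords": [{"Value": target_dns}],
            },
        }]},
    )

# Real use (needs AWS credentials and a real hosted zone ID):
#   import boto3
#   dns = find_master_dns(boto3.client("emr"))
#   if dns:
#       upsert_record(boto3.client("route53"), "Z0EXAMPLE", "cluster.private.", dns)
```

With this in place, Livy jobs can always be submitted to cluster.private:8998 regardless of which private IP the master node received.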