I have created a container service and set the orchestration to Swarm. I have 1 master and 2 agents. I was expecting the swarm to be initialised automatically, but that doesn't appear to be the case, so I need to remote onto each VM to connect it to a swarm manager.
Whilst I can connect to my master VM via SSH, I don't see how to connect to either of the agent VMs in the scale set.
I've tried the following in Git Bash, based on the instance names listed in the scale set...
$ ssh moconnor@swarm-agentpool-16065278-vmss_1 -i /c/Users/Matthew.OConnor/azure
where -i points to my private SSH key, but I get the following error...
ssh: Could not resolve hostname swarm-agentpool-16065278-vmss_1: Name or service not known
I assume this is because swarm-agentpool-16065278-vmss_1 is neither a valid IP address nor a DNS name, but how do I get this value for each VM in the scale set?
The following works for connecting to my master...
ssh [email protected] -i /c/Users/Matthew.OConnor/azure
According to this section in the guide, I should be seeing some inbound NAT rules for each VM in the scale set.
For me this screen is empty...
and it doesn't allow me to add anything due to the following message...
Full virtual machine scale set support for the portal is coming soon. Adding or editing references between load balancers and scale set virtual machines is currently disabled for load balancers that contain an existing association with a scale set.
How do I connect to the VMs in a scale set created by Container Service?
You could SSH to the master VM and find the agent's private IP in the Azure Portal.
Then you could SSH to the agent instance, for example:
ssh -i ~/.ssh/id_rsa <username>@10.0.0.5
Note: id_rsa is the same key as on the master VM.
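A minimal sketch of that workflow, assuming the Azure CLI (az) is available in Git Bash; <resource-group> and <master-address> are placeholders for your own values, the scale set name is taken from the question, and 10.0.0.5 is only an example address:

# 1) List the private IP of each agent VM in the scale set
#    (a CLI alternative to looking them up in the Azure Portal):
az vmss nic list \
  --resource-group <resource-group> \
  --vmss-name swarm-agentpool-16065278-vmss \
  --query "[].ipConfigurations[].privateIpAddress" -o tsv

# 2) SSH to the master with agent forwarding (-A), so the private key
#    does not have to be copied onto the master:
ssh -A -i /c/Users/Matthew.OConnor/azure moconnor@<master-address>

# 3) From the master, hop to an agent using a private IP from step 1:
ssh moconnor@10.0.0.5

With agent forwarding, the same key pair you supplied when the cluster was created is used for both hops, which matches the note above that the key is the same as the master's.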