kubernetes, azure-aks

Shell (ssh) into Azure AKS (Kubernetes) cluster worker node


I have a Kubernetes cluster in Azure using AKS and I'd like to 'login' to one of the nodes. The nodes do not have a public IP.

Is there a way to accomplish this?


Solution

  • The procedure is described at length in the Azure documentation: https://learn.microsoft.com/en-us/azure/aks/ssh. It consists of running a pod that you use as a relay to ssh into the nodes, and it works perfectly fine:

    You probably specified the ssh username and public key during cluster creation. If not, you have to configure your nodes to accept them as the ssh credentials:

    $ az vm user update \
      --resource-group MC_myResourceGroup_myAKSCluster_region \
      --name node-name \
      --username theusername \
      --ssh-key-value ~/.ssh/id_rsa.pub
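
    Note that az vm user update only applies when the nodes are plain VMs in an availability set. If your cluster uses a virtual machine scale set instead (the default for newer AKS clusters), the same article describes pushing the credentials through the VMAccessForLinux extension. A sketch, where myAKSVMSS is a placeholder for your scale set name (the first command lists it):

    $ az vmss list --resource-group MC_myResourceGroup_myAKSCluster_region -o table
    $ az vmss extension set \
      --resource-group MC_myResourceGroup_myAKSCluster_region \
      --vmss-name myAKSVMSS \
      --name VMAccessForLinux \
      --publisher Microsoft.OSTCExtensions \
      --version 1.4 \
      --protected-settings "{\"username\":\"theusername\", \"ssh_key\":\"$(cat ~/.ssh/id_rsa.pub)\"}"
    # apply the updated model to the existing instances
    $ az vmss update-instances \
      --resource-group MC_myResourceGroup_myAKSCluster_region \
      --name myAKSVMSS \
      --instance-ids '*'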
    

    To find your node names:

    az vm list --resource-group MC_myResourceGroup_myAKSCluster_region -o table
    

    When done, run a pod on your cluster with an ssh client inside; this is the pod you will use to ssh into your nodes:

    kubectl run -it --rm my-ssh-pod --image=debian
    # install an ssh client, as there is none in the Debian image
    apt-get update && apt-get install openssh-client -y
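
    One possible variation (not from the article): because of --rm, the pod and everything you installed in it disappear as soon as you exit the shell. To keep the relay pod alive across sessions, start it detached and exec into it:

    # keep the pod running independently of your terminal session
    kubectl run my-ssh-pod --image=debian --command -- sleep infinity
    kubectl exec -it my-ssh-pod -- bash
    # then, inside the pod:
    apt-get update && apt-get install openssh-client -y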
    

    On your workstation, get the name of the pod you just created:

    $ kubectl get pods
    

    Add your private key into the pod:

    $ kubectl cp ~/.ssh/id_rsa pod-name:/id_rsa
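
    If ssh later refuses the key with an "unprotected private key file" error, tighten its permissions from inside the pod; ssh requires a private key to be readable only by its owner:

    chmod 600 /id_rsa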
    

    Then, in the pod, connect via ssh to one of your nodes:

    ssh -i /id_rsa theusername@node-ip
    

    (To find the node IPs, run this on your workstation:)

    az vm list-ip-addresses --resource-group MC_myResourceGroup_myAKSCluster_region -o table
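
    Alternatively, kubectl already knows the node internal IPs (the INTERNAL-IP column), which saves a round trip through the Azure CLI:

    kubectl get nodes -o wide

    When you are finished, exit the pod's shell; thanks to --rm, the pod is deleted automatically. If you started it without --rm (as in the detached variation above), delete it yourself, since it still holds your private key:

    kubectl delete pod my-ssh-pod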