Tags: kubernetes, kubectl, google-kubernetes-engine

Connect local instance of kubectl to GKE cluster without using gcloud tool?


Does anyone know how to connect a local instance of kubectl to a Google Kubernetes Engine (GKE) cluster without using the gcloud tool locally?

For example:

If you use the gcloud tool with this command:

gcloud container clusters get-credentials NAME [--zone=ZONE, -z ZONE] [GCLOUD_WIDE_FLAG …]
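
For example, with a hypothetical cluster name and zone (my-cluster and us-central1-a are placeholders, not values from the original question):

gcloud container clusters get-credentials my-cluster --zone us-central1-a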

You'll find a user like this in ~/.kube/config:

- name: gke_myproj_myzone
  user:
    auth-provider:
      config:
        access-token: TOKENSTRING
        cmd-args: config config-helper --format=json
        cmd-path: /google/google-cloud-sdk/bin/gcloud
        expiry: 2018-01-22 18:05:46
        expiry-key: '{.credential.token_expiry}'
        token-key: '{.credential.access_token}'
      name: gcp

As you can see, the default values that the gcloud tool provides require the gcloud tool itself as an auth-provider to log in to your cluster.

Now, what I'm looking for is a way to connect kubectl to a cluster from a machine that does not have gcloud installed.


Solution

  • The easiest way to achieve this is to copy the ~/.kube/config file (from a gcloud-authenticated instance) to the $HOME/.kube directory on your local instance (laptop).

    But first, using the authenticated instance, you have to enable legacy client certificates, per this document, by running these commands:

    gcloud config set container/use_client_certificate True
    export CLOUDSDK_CONTAINER_USE_CLIENT_CERTIFICATE=True
    

    Then execute the get-credentials command and copy the file (a sketch of the resulting user entry is shown after this list).

    gcloud container clusters get-credentials NAME [--zone=ZONE, -z ZONE] [GCLOUD_WIDE_FLAG …]
    

    Note that you may have to run the get-credentials command and copy the config file again every time the authentication tokens (saved in the config file) expire.
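
    For reference, with legacy client certificates enabled, the user entry in the copied ~/.kube/config should contain static credentials instead of the gcloud auth-provider, roughly like this (a sketch only; the base64 values are placeholders, and the exact fields can vary by gcloud version):

    - name: gke_myproj_myzone
      user:
        client-certificate-data: BASE64_ENCODED_CLIENT_CERT
        client-key-data: BASE64_ENCODED_CLIENT_KEY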
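
    Once the file is in place on the local machine, you can verify the connection with kubectl alone (the context name below is a placeholder; list yours with kubectl config get-contexts):

    kubectl config use-context gke_myproj_myzone_my-cluster
    kubectl get nodes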