I have created a Kubernetes cluster, and one of the instances in the cluster is inactive. I want to review the configuration of the Kubernetes Engine cluster for that inactive instance. Which command should I use: `kubectl config get-contexts`, or `kubectl config use-context` and `kubectl config view`?
I am a beginner to cloud, can anyone explain?
`kubectl config get-contexts` will not help you debug why the instance is failing; it just shows you the list of contexts. A context is a group of cluster access parameters: each context contains a Kubernetes cluster, a user, and a namespace. The current context is the cluster that is currently the default for `kubectl`. On the other hand, `kubectl config view` will just print your kubeconfig settings.
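To make the structure of a context concrete, here is a minimal sketch. The kubeconfig below is a made-up example (the context, cluster, and user names are all assumptions), written to a temporary file so you can see what those commands read without touching a real cluster:

```shell
# Write a minimal, fake kubeconfig to illustrate what a context is.
cat > /tmp/demo-kubeconfig <<'EOF'
apiVersion: v1
kind: Config
clusters:
- name: demo-gke-cluster
  cluster:
    server: https://203.0.113.10
contexts:
- name: demo-context
  context:
    cluster: demo-gke-cluster
    user: demo-user
    namespace: default
current-context: demo-context
users:
- name: demo-user
  user: {}
EOF

# A context bundles a cluster, a user, and a namespace:
grep -A 4 'contexts:' /tmp/demo-kubeconfig

# The current context is the default target for kubectl:
grep 'current-context' /tmp/demo-kubeconfig
```

With a real cluster, `KUBECONFIG=/tmp/demo-kubeconfig kubectl config get-contexts` would list `demo-context`, and `kubectl config view` would print these same settings back to you, which is why neither command tells you anything about a failing node.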
The best place to start is the official Kubernetes documentation. It provides good basic steps for troubleshooting your cluster, and some of them apply to GKE as well as to kubeadm or Minikube clusters.
If you're using GKE, you can read the node logs from Stackdriver. This document is an excellent start when you want to check the logs directly in the log viewer.
If one of your instances reports `NotReady` after listing them with `kubectl get nodes`, I suggest SSHing into that instance and checking the Kubernetes node components (`kubelet` and `kube-proxy`). You can view the GKE nodes from the instances page.
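As a sketch of what that check looks like, the node names and statuses below are made-up sample output in the shape `kubectl get nodes` produces; on a real cluster you would pipe the command itself through the same `awk` filter:

```shell
# Stand-in for `kubectl get nodes` output (made-up node names).
kubectl_get_nodes_sample() {
  cat <<'EOF'
NAME                                 STATUS     ROLES    AGE   VERSION
gke-demo-default-pool-1a2b3c4d-abcd  Ready      <none>   12d   v1.14.10-gke.27
gke-demo-default-pool-1a2b3c4d-efgh  NotReady   <none>   12d   v1.14.10-gke.27
EOF
}

# Print only the nodes whose STATUS column is NotReady.
kubectl_get_nodes_sample | awk '$2 == "NotReady" { print $1 }'
```

From there, `gcloud compute ssh <node-name>` (with the appropriate zone) gets you onto the instance to inspect the components.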
Kube-proxy logs:

`/var/log/kube-proxy.log`
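Once on the node, a quick way to narrow that file down is to grep for error-severity lines (klog-style lines start with `E` for errors, `I` for info). A sketch, using a made-up log excerpt in place of the real file:

```shell
# Stand-in for /var/log/kube-proxy.log; on the node you would run:
#   sudo grep '^E' /var/log/kube-proxy.log | tail -n 20
sample_kube_proxy_log() {
  cat <<'EOF'
I0601 10:00:01.000001    1234 proxier.go:123] Syncing iptables rules
E0601 10:00:02.000002    1234 proxier.go:456] Failed to execute iptables-restore: exit status 1
I0601 10:00:03.000003    1234 proxier.go:123] Syncing iptables rules
EOF
}

# klog prefixes each line with its severity: I=info, W=warning, E=error.
sample_kube_proxy_log | grep '^E'
```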
If you want to check the `kubelet` logs, they are kept as a `systemd` unit on COS nodes and can be accessed using `journalctl`.
Kubelet logs:

`sudo journalctl -u kubelet`
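A few `journalctl` flags help on a real node (shown in comments, since they need a running `kubelet` unit); the executable part below filters a made-up journal excerpt the same way you would filter the real one:

```shell
# On the node itself:
#   sudo journalctl -u kubelet --no-pager -n 100      # last 100 lines
#   sudo journalctl -u kubelet --since "1 hour ago"   # recent entries only
#   sudo journalctl -u kubelet -f                     # follow live

# Made-up excerpt standing in for `journalctl -u kubelet` output:
sample_kubelet_journal() {
  cat <<'EOF'
Jun 01 10:00:01 gke-node kubelet[987]: I0601 10:00:01.1 kubelet.go:100] Node ready
Jun 01 10:00:05 gke-node kubelet[987]: E0601 10:00:05.2 kubelet.go:200] Failed to start ContainerManager
Jun 01 10:00:09 gke-node kubelet[987]: I0601 10:00:09.3 kubelet.go:100] Syncing pods
EOF
}

# Surface only error-severity (E...) kubelet lines:
sample_kubelet_journal | grep -E 'kubelet\[[0-9]+\]: E'
```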
For further debugging, it is worth mentioning that the GKE master is a node inside a Google-managed project, separate from your cluster's project. For detailed master logs you will have to open a Google support ticket. Here is more information about how the GKE cluster architecture works, in case the issue is related to the api-server.
Let me know if that was helpful.