I run new modules of my system in Google Container Engine. I would like to ship stdout and stderr from them (running in pods) to my centralised logstash. Is there an easy way to forward logs from pods to an external logging service, e.g., logstash or elasticsearch?
I decided to log directly to elasticsearch, an external virtual machine that can be accessed at elasticsearch.c.my-project.internal
(I am on Google Cloud Platform). It is quite easy:
Set up a Service of type ExternalName, named elasticsearch-logging, that points to the elasticsearch instance:
apiVersion: v1
kind: Service
metadata:
  name: elasticsearch-logging
  namespace: kube-system
  labels:
    k8s-app: elasticsearch
    kubernetes.io/name: "elasticsearch"
spec:
  type: ExternalName
  externalName: elasticsearch.c.my-project.internal
  ports:
    - port: 9200
      targetPort: 9200
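Note that ExternalName only works when the backend has a DNS name. If your Elasticsearch VM is only reachable by IP address, the usual alternative is a Service without a selector plus a manually maintained Endpoints object — a sketch, where 10.128.0.5 is a placeholder for your VM's actual IP:

```yaml
apiVersion: v1
kind: Service
metadata:
  name: elasticsearch-logging
  namespace: kube-system
spec:
  ports:
    - port: 9200
      targetPort: 9200
---
apiVersion: v1
kind: Endpoints
metadata:
  name: elasticsearch-logging   # must match the Service name exactly
  namespace: kube-system
subsets:
  - addresses:
      - ip: 10.128.0.5          # placeholder: your Elasticsearch VM's IP
    ports:
      - port: 9200
```

Either way, pods resolve elasticsearch-logging inside the cluster and traffic is routed to the external machine.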
Deploy fluentd-elasticsearch as a DaemonSet; the fluentd-elasticsearch image automatically connects to the service named elasticsearch-logging
(based on a fluentd-elasticsearch deployment definition):
apiVersion: extensions/v1beta1
kind: DaemonSet
metadata:
  name: fluentd-elasticsearch
  namespace: kube-system
  labels:
    tier: monitoring
    app: fluentd-logging
    k8s-app: fluentd-logging
spec:
  template:
    metadata:
      labels:
        name: fluentd-elasticsearch
    spec:
      containers:
      - name: fluentd-elasticsearch
        image: gcr.io/google_containers/fluentd-elasticsearch:1.19
        volumeMounts:
        - name: varlog
          mountPath: /var/log
        - name: varlibdockercontainers
          mountPath: /var/lib/docker/containers
          readOnly: true
      terminationGracePeriodSeconds: 30
      volumes:
      - name: varlog
        hostPath:
          path: /var/log
      - name: varlibdockercontainers
        hostPath:
          path: /var/lib/docker/containers
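Under the hood, the image tails the container log files from the host paths mounted above and forwards them through fluent-plugin-elasticsearch. Its output section is roughly equivalent to the following sketch (the exact configuration baked into the image may differ):

```
<match **>
  type elasticsearch          # fluent-plugin-elasticsearch output
  host elasticsearch-logging  # resolved via the Service created above
  port 9200
  logstash_format true        # write daily logstash-* style indices
</match>
```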
Use kubectl logs fluentd-elasticsearch-...
(with the actual pod name) to check whether fluentd was able to connect to the elasticsearch instance. You can also query the Elasticsearch HTTP API on port 9200 directly to confirm that new indices are being created.