Tags: elasticsearch, kubernetes, kibana, fluentd, efk

Host journal logs not present in EFK Kubernetes stack


I'm using kube-fluentd-operator to aggregate logs with fluentd into Elasticsearch and query them in Kibana.

I can see my application (pod) logs inside the cluster. However, I cannot see the journal logs (systemd units, kubelet, etc.) from the cluster hosts.

There are no noticeable messages in the fluentd pods' logs, and the stack works for logs coming from applications. Inside the fluentd container I have access to the /var/log/journal directory (drwxr-sr-x 3 root 101 4096 May 21 12:37 journal). Where should I look next to get the journald logs into my EFK stack?

Here's the kube-system.conf file attached to the kube-system namespace:

<match systemd.** kube.kube-system.** k8s.** docker>
  # all k8s-internal and OS-level logs

  # ship matched records to the Elasticsearch service in the "logs" namespace
  @type elasticsearch
  host "logs-es-http.logs"
  port "9200"
  scheme "https"
  # skip TLS certificate verification
  ssl_verify false
  user "u1"
  password "password"
  # write to daily logstash-YYYY.MM.DD indices
  logstash_format true
  #with_transporter_log true
  #@log_level debug
  validate_client_version true
  ssl_version TLSv1_2
</match>

Minimal, simple, according to the docs.
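My understanding is that a <match> block only routes records that some input has already read and tagged; the journal entries themselves have to be picked up by a systemd source, which the operator is supposed to wire up on its own. With plain fluentd, such a source would look roughly like this sketch using the fluent-plugin-systemd input (the tag and cursor path here are illustrative assumptions, not my actual config):

<source>
  # read the host journal; requires fluent-plugin-systemd
  @type systemd
  path /var/log/journal
  # tag chosen so the systemd.** pattern above picks it up (illustrative)
  tag systemd.journal
  read_from_head true
  <storage>
    # persist the journal cursor across restarts (illustrative path)
    @type local
    persistent true
    path /var/log/fluentd-journald-cursor.json
  </storage>
  <entry>
    # strip the leading underscores from journal field names
    fields_strip_underscores true
  </entry>
</source>

If no such source is active, or it cannot read the journal, the match block has nothing to forward, which would explain the silence without any error output.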

Is it possible that my search terms are wrong? What should I search for in order to get the journal logs?


Solution

  • After trying every possible fix (enabling log_level debug, monitoring only the kube-system namespace, adding runAsGroup: 101 to the containers), all I was left with was changing the log-aggregation tooling itself. I switched from the operator to the DaemonSet provided by Fluent themselves: https://github.com/fluent/fluentd-kubernetes-daemonset

    This switch has proved successful, and searching for the systemd units now works from inside the EFK stack; the kind of journald sources the DaemonSet uses are sketched below.
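For reference, the DaemonSet images read the journal with per-unit sources roughly like the following. This is a sketch modeled on the systemd.conf shipped in that repository (exact contents vary by image version, and the cursor path shown is illustrative); as far as I can tell, the systemd sources are enabled by default in the systemd image variants and can be suppressed by setting the FLUENTD_SYSTEMD_CONF environment variable to disable:

<source>
  # journal entries from the kubelet unit
  @type systemd
  matches [{ "_SYSTEMD_UNIT": "kubelet.service" }]
  read_from_head true
  tag kubelet
  <storage>
    # persist the journal cursor across restarts (illustrative path)
    @type local
    persistent true
    path /var/log/fluentd-journald-kubelet-cursor.json
  </storage>
  <entry>
    # strip the leading underscores from journal field names
    fields_strip_underscores true
  </entry>
</source>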