I have a Spring Boot application running in a Kubernetes cluster and an EFK stack (like ELK, but with Fluentd instead of Logstash as a lightweight collector that gathers the logs from all Kubernetes pods and sends them to Elasticsearch).
To adapt the logs to JSON output, I used the logstash-logback-encoder library:
<dependency>
    <groupId>net.logstash.logback</groupId>
    <artifactId>logstash-logback-encoder</artifactId>
    <version>4.11</version>
</dependency>
And out of the box I had my logs converted to JSON (which is great).
I log to STDOUT, everything gets picked up and sent to Elasticsearch. No special configuration for logging is needed inside the Spring Boot application.
But the problem I have right now is that when reading my logs in real time from the STDOUT of the Kubernetes pod, they are very hard to read because of all the JSON formatting.
Example:
{"@timestamp":"2018-02-08T12:49:06.080+01:00","@version":1,"message":"Mapped \"{[/error],produces=[text/html]}\" onto public org.springframework.web.servlet.ModelAndView org.springframework.boot.autoconfigure.web.BasicErrorController.errorHtml(javax.servlet.http.HttpServletRequest,javax.servlet.http.HttpServletResponse)","logger_name":"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.080+01:00","@version":1,"message":"Mapped \"{[/error]}\" onto public org.springframework.http.ResponseEntity<java.util.Map<java.lang.String, java.lang.Object>> org.springframework.boot.autoconfigure.web.BasicErrorController.error(javax.servlet.http.HttpServletRequest)","logger_name":"org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.098+01:00","@version":1,"message":"Mapped URL path [/webjars/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.098+01:00","@version":1,"message":"Mapped URL path [/**] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.137+01:00","@version":1,"message":"Mapped URL path [/**/favicon.ico] onto handler of type [class org.springframework.web.servlet.resource.ResourceHttpRequestHandler]","logger_name":"org.springframework.web.servlet.handler.SimpleUrlHandlerMapping","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.268+01:00","@version":1,"message":"Registering beans for JMX exposure on startup","logger_name":"org.springframework.jmx.export.annotation.AnnotationMBeanExporter","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.333+01:00","@version":1,"message":"Initializing ProtocolHandler [\"http-nio-8080\"]","logger_name":"org.apache.coyote.http11.Http11NioProtocol","thread_name":"main","level":"INFO","level_value":20000}
{"@timestamp":"2018-02-08T12:49:06.355+01:00","@version":1,"message":"Starting ProtocolHandler [\"http-nio-8080\"]","logger_name":"org.apache.coyote.http11.Http11NioProtocol","thread_name":"main","level":"INFO","level_value":20000}
What I want to do is log to STDOUT in a 'normal', non-JSON format, while still sending the logs to Fluentd in JSON format.
I am trying to configure two log appenders (one writing plain text to STDOUT and another writing JSON for Fluentd), but I am pretty sure this will duplicate the data (Fluentd would get the JSON output AND the plain STDOUT output).
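For illustration, the two-appender setup I was experimenting with looks roughly like this in logback-spring.xml (just a sketch, not a final config; the /var/log/app path is a made-up placeholder, and Fluentd would then have to tail that file instead of the container STDOUT, which is exactly the extra configuration I would rather avoid):

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
    <!-- Plain, human-readable output for kubectl logs -->
    <appender name="CONSOLE" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
            <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
        </encoder>
    </appender>

    <!-- JSON output written to a file that Fluentd would have to tail -->
    <appender name="JSON_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>/var/log/app/app.json</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
            <fileNamePattern>/var/log/app/app.%d{yyyy-MM-dd}.json</fileNamePattern>
            <maxHistory>7</maxHistory>
        </rollingPolicy>
        <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
    </appender>

    <root level="INFO">
        <appender-ref ref="CONSOLE"/>
        <appender-ref ref="JSON_FILE"/>
    </root>
</configuration>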
My plan B is to build one image for deployment (without the JSON format) and another for production, but that is more like plan Z, to be honest, because I want to be able to monitor those pods in production as well.
My question is: how can I do this, ideally with a single log appender, or at least without duplicating the data in Fluentd? Is there maybe a different approach that I haven't thought of?
Even though I was tempted by one of the proposed solutions, in the end I just used jq, a command-line JSON processor, to view my logs on the CLI. I did this to avoid duplicating the log data, and to avoid creating log files or having to specially configure Fluentd to read logs from files.
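For reference, the kind of invocation I mean looks roughly like this (the pod name is a placeholder; the field names are the JSON keys shown in the example output above). Note that lines which are not valid JSON, such as anything the JVM prints directly to the console, may trip jq up, so it works best when everything goes through the encoder.

# Pretty-print each JSON log line
kubectl logs -f my-app-pod | jq .

# Or collapse each line back into a classic one-line log format
kubectl logs -f my-app-pod | jq -r '"\(."@timestamp") \(.level) [\(.thread_name)] \(.logger_name) - \(.message)"'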