I'm new to fluentd and I would like to parse multi-level nested escaped JSON strings inside JSON.
My messages look like:
{"log":"HELLO WORLD\n","stream":"stdout","time":"2019-05-23T15:40:54.298531098Z"}
{"log":"{\"appName\":\"adapter\",\"time\":\"2019-05-23T15:40:54.299\",\"message\":\"{\\\"level\\\":\\\"info\\\",\\\"message\\\":\\\"Awaiting Messages from queue...\\\"}\"}\n","stream":"stdout","time":"2019-05-23T15:40:54.2996761Z"}
The first message gets parsed correctly, but the second one gets ignored, and I guess it's because of an error in the parsing format.
Here is my source:
<source>
  @id fluentd-containers.log
  @type tail
  path /var/log/containers/*.log
  pos_file /var/log/containers.log.pos
  tag raw.kubernetes.*
  read_from_head true
  <parse>
    @type multi_format
    <pattern>
      format json
      time_key time
      time_format %Y-%m-%dT%H:%M:%S.%NZ
    </pattern>
    <pattern>
      format /^(?<time>.+) (?<stream>stdout|stderr) [^ ]* (?<log>.*)$/
      time_format %Y-%m-%dT%H:%M:%S.%N%:z
    </pattern>
  </parse>
</source>
Here is what I tried:
<filter **>
  @type parser
  key_name log
  reserve_data true
  remove_key_name_field true
  hash_value_field parsed_log
  <parse>
    @type json
  </parse>
</filter>
I actually just want to parse this log message:
{
  "log":"{\"appName\":\"dedge-adapter\",\"time\":\"2019-05-24T02:39:12.242\",\"message\":\"{\\\"level\\\":\\\"warn\\\",\\\"status\\\":401,\\\"method\\\":\\\"GET\\\",\\\"path\\\":\\\"/api/v1/bookings\\\",\\\"requestId\\\":\\\"782a470b-9d62-43d3-9865-1b67397717d4\\\",\\\"ip\\\":\\\"90.79.204.18\\\",\\\"latency\\\":0.097897,\\\"user-agent\\\":\\\"PostmanRuntime/7.11.0\\\",\\\"message\\\":\\\"Request\\\"}\"}\n",
  "stream":"stdout",
  "time":"2019-05-24T02:39:12.242383376Z"
}
Does your log field contain multiple formats? If so, you can use fluent-plugin-multi-format-parser: https://github.com/repeatedly/fluent-plugin-multi-format-parser
<source>
  @type dummy
  tag dummy
  dummy [
    {"log":"HELLO WORLD\n","stream":"stdout","time":"2019-05-23T15:40:54.298531098Z"},
    {"log":"{\"appName\":\"adapter\",\"time\":\"2019-05-23T15:40:54.299\",\"message\":\"{\\\"level\\\":\\\"info\\\",\\\"message\\\":\\\"Awaiting Messages from queue...\\\"}\"}\n","stream":"stdout","time":"2019-05-23T15:40:54.2996761Z"}
  ]
</source>

# first pass: parse the escaped JSON in the "log" field; plain text falls through via "none"
<filter dummy>
  @type parser
  key_name log
  reserve_data true
  remove_key_name_field true
  <parse>
    @type multi_format
    <pattern>
      format json
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

# second pass: parse the "message" field; nested JSON is decoded, plain text stays as-is
<filter dummy>
  @type parser
  key_name message
  reserve_data true
  remove_key_name_field true
  <parse>
    @type multi_format
    <pattern>
      format json
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

<match dummy>
  @type stdout
</match>
Output:
2019-06-03 11:41:13.022468253 +0900 dummy: {"stream":"stdout","time":"2019-05-23T15:40:54.298531098Z","message":"HELLO WORLD\n"}
2019-06-03 11:41:14.024253824 +0900 dummy: {"stream":"stdout","time":"2019-05-23T15:40:54.2996761Z","appName":"adapter","level":"info","message":"Awaiting Messages from queue..."}
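The two parser filters are chained on purpose: the first one decodes the escaped JSON string in the log field (the none pattern lets plain-text lines like "HELLO WORLD" pass through), and the second one decodes the message field that the first pass produced; reserve_data true keeps the other keys on the record. A minimal sketch of the same idea against your tail source, assuming your events still carry the raw.kubernetes.* tag when they reach these filters, would be:

# sketch only: adjust the filter match pattern to wherever these records sit in your pipeline
<filter raw.kubernetes.**>
  @type parser
  key_name log
  reserve_data true
  remove_key_name_field true
  <parse>
    @type multi_format
    <pattern>
      format json
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

<filter raw.kubernetes.**>
  @type parser
  key_name message
  reserve_data true
  remove_key_name_field true
  <parse>
    @type multi_format
    <pattern>
      format json
    </pattern>
    <pattern>
      format none
    </pattern>
  </parse>
</filter>

If you want to keep the original escaped string as well, drop remove_key_name_field true, or nest the parsed result under its own key with hash_value_field as in your first attempt.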