
Grafana Loki LogQL: How to parse log lines with different log formats?


We have different types of logs for one and the same application. Some come from our application, which logs in JSON format; the others are various other kinds of log messages.

For example, these three log lines:

"{\"written_at\": \"2022-03-30T07:51:04.934Z\", \"written_ts\": 1648626664934052000, \"msg\": \"Step 'X' started at 2022-03-30 07:51:04\", \"type\": \"log\", \"logger\": \"my-logger\", \"thread\": \"MainThread\", \"level\": \"DEBUG\", \"module\": \"my.module\", \"line_no\": 48}\n"
"                    ERROR    Data processing error: Did not work       \n"
"FileNotFoundError: [Errno 2] No such file or directory: '/local.json'\n"

To parse our application JSON logs we perform the following LogQL query:

| json log="log" 
| line_format "{{.log}}"
| json | line_format "{{.msg}}"
| __error__ != "JSONParserErr"

As our query already states, we cannot parse the other log lines, since they are not in JSON format.

Can we define different parsing and formatting depending on conditions? Or as a fallback when a JSONParserErr occurs?


Solution

  • Not sure if you managed to get an answer to this (I'm also looking to see whether it's possible in a single query), but you can do it with multiple queries…

    For the JSON rows

    {swarm_stack="apiv2-acme", swarm_service="apiv2-acme_tenant-import"}
    | json log="log"
    | line_format "{{.log}}"
    | json
    | line_format "{{.msg}}"
    | __error__ != "JSONParserErr"
    # … more processing
    

    For the non-JSON rows…

    {swarm_stack="apiv2-acme", swarm_service="apiv2-acme_tenant-import"}
    | json
    | __error__ = "JSONParserErr"
    | drop __error__
    # … more processing
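
    If you want to attempt this in a single query, one option worth trying is a conditional in the Go-template `line_format` stage: after `| json`, the `msg` field is only populated for lines that actually parsed as JSON, so an `if` can fall back to the raw line. This is a sketch, not a tested answer for this exact log shape; it assumes a Loki version that provides the `__line__` template function (2.6+), that the JSON lines expose `msg` after a single `| json` stage, and it reuses the stream selector from the queries above.

    ```logql
    {swarm_stack="apiv2-acme", swarm_service="apiv2-acme_tenant-import"}
    | json
    | line_format "{{ if .msg }}{{ .msg }}{{ else }}{{ __line__ }}{{ end }}"
    ```

    Non-JSON lines will still carry the `__error__="JSONParserErr"` label after this, so you may want to finish with `| drop __error__` as in the second query.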