Tags: open-telemetry, grafana-loki

Grafana Loki error unexpected "=" in label set


I’m trying to write OpenTelemetry logs to Loki, pushing them directly from my .NET service with the OpenTelemetry packages, but I ran into a problem. When I try to view the logs in Grafana, I get an error:

could not write JSON response: 1:2: parse error: unexpected "=" in label set, expected identifier or "}"

After some research I realized that the problem occurs only with some of the log entries. The service has a metrics endpoint for Prometheus, and when Prometheus scrapes it, the service writes a log entry like this:

RequestPath:/api/metrics RequestId:8000886a-0002-f400-b63f-84710c7967bb Microsoft.AspNetCore.Hosting.Diagnostics - Request finished HTTP/2 GET https://service/api/metrics - 200 - application/openmetrics-text;+version=1.0.0;+charset=utf-8 7.6416ms

I assume that Loki uses the string “application/openmetrics-text;+version=1.0.0;+charset=utf-8” as a label and fails when I try to query the logs.

Does Loki have any settings to handle this case? The label is created automatically by the OpenTelemetry package, so I don’t know what to do about it. I could disable the Microsoft.AspNetCore.Hosting.Diagnostics logs, but I think the issue should be resolved on the Loki side or on the OTel package side.
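For context, Loki label sets use a Prometheus-style `{name="value", ...}` syntax, where each entry needs a non-empty identifier before the `=`. A rough, illustrative Python sketch of that rule (not Loki’s actual parser) shows why an empty or malformed label name triggers exactly this kind of parse error:

```python
import re

# Simplified, illustrative check for a Prometheus/Loki-style label set:
# every pair must be  identifier="quoted value".  Not Loki's real parser.
PAIR = re.compile(r'^([A-Za-z_][A-Za-z0-9_]*)="([^"]*)"$')

def validate_label_set(label_set: str) -> None:
    body = label_set.strip()
    if not (body.startswith("{") and body.endswith("}")):
        raise ValueError("label set must be wrapped in { }")
    inner = body[1:-1].strip()
    if not inner:
        return  # an empty label set is syntactically fine
    for pair in inner.split(","):
        pair = pair.strip()
        if not PAIR.match(pair):
            # mirrors Loki's 'unexpected "=" in label set, expected identifier'
            raise ValueError(f'unexpected "=" in label set near {pair!r}')

validate_label_set('{app="service", env="prod"}')   # OK
try:
    validate_label_set('{app="service", ="oops"}')  # empty label name
except ValueError as e:
    print(e)
```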


Solution

  • There seems to be an issue with empty attributes, and potentially with certain characters. See this post for the empty-attributes case: https://community.grafana.com/t/otlp-ingestion-from-otel-collector-to-loki-seems-to-corrupt-my-data/140588/3

    It worked for me just now using the OpenTelemetry Collector; note the "attributes/empty" processor. It essentially removes the empty attributes from the data before it is sent to Loki.

    The error itself seems to happen on the Grafana side.

    receivers:
      otlp:
        protocols:
          grpc:
            endpoint: 0.0.0.0:4317
          http:
            endpoint: 0.0.0.0:4318
    
    exporters:
      debug:
      otlphttp/logs:
        endpoint: "http://loki:3100/otlp"
        tls:
          insecure: true
      otlp:
        endpoint: "tempo:4317"
        tls:
          insecure: true
        sending_queue:
          num_consumers: 4
          queue_size: 100
        retry_on_failure:
          enabled: true
      prometheus:
        endpoint: '0.0.0.0:8889'
    
    processors:
      batch:
      memory_limiter:
        # 80% of maximum memory up to 2G
        limit_mib: 400
        # 25% of limit up to 2G
        spike_limit_mib: 100
        check_interval: 5s
      attributes/empty:
        actions:
          - key: "empty"
            # the regex is matched against attribute keys:
            # delete attributes whose key is empty or whitespace
            pattern: ^\s*$
            action: delete
    
    service:
      pipelines:
        traces:
          receivers: [otlp]
          processors: [memory_limiter, batch]
          exporters: [otlp,debug]
        metrics:
          receivers: [otlp]
          processors: [memory_limiter, batch]
          exporters: [prometheus,debug]
        logs:
          receivers: [otlp]
          processors: [memory_limiter, attributes/empty, batch]
          exporters: [debug,otlphttp/logs]
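What the "attributes/empty" processor does can be sketched in a few lines; the Python below is an illustrative, language-agnostic sketch of the idea (not the Collector’s actual implementation), dropping attributes whose key matches the empty/whitespace pattern (and, for good measure, empty values) before the record would be exported to Loki:

```python
import re

# Same pattern as in the attributes/empty processor config above.
EMPTY = re.compile(r"^\s*$")

def drop_empty_attributes(attributes: dict) -> dict:
    """Remove attributes with an empty/whitespace key or an empty value."""
    return {
        k: v
        for k, v in attributes.items()
        if not EMPTY.match(str(k)) and v not in ("", None)
    }

attrs = {
    "http.method": "GET",
    "": "orphan value",   # empty key: would corrupt the Loki label set
    "http.route": "",     # empty value
}
print(drop_empty_attributes(attrs))  # {'http.method': 'GET'}
```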