So I have a configuration like this:
scrape_configs:
  - job_name: flog_scrape
    docker_sd_configs:
      - host: unix:///var/run/docker.sock
        refresh_interval: 5s
    relabel_configs:
      - source_labels: ['__meta_docker_container_name']
        regex: '/(.*)'
        target_label: 'container'
That is all well and good, but on the target machine I have both interesting containers and garbage containers, and I only want logs from containers named interesting_something.
What do I do to drop all lines whose container label does not start with "interesting_", i.e. does not match "interesting_.*"?
I've found ways to drop the "container" label conditionally; I've also found ways to filter log messages based on their content; I've also found a way to drop labels later in the pipeline stages; but I have NOT found any way to filter/keep/drop an entire log line based on label contents, which is why I'm asking.
Adding to that, I am constantly bitten by the error "pipeline stage must contain only one key", for which there are many reports explaining that it is really a YAML indentation error, yet never describing what the actual formatting fix is. So really I'm looking for a solution that works when copied verbatim.
Since all of the interesting containers were brought up by a single docker-compose project in my case, I just filtered docker_sd_configs by the docker-compose project name:
scrape_configs:
  - job_name: flog_scrape
    docker_sd_configs:
      - host: unix:///var/run/docker.sock
        refresh_interval: 1s
        filters:
          - name: label
            values: ['com.docker.compose.project=compose-web']
    relabel_configs:
      - source_labels: ['__meta_docker_container_name']
        regex: '/(.*)'
        target_label: 'container'
    pipeline_stages:
      - static_labels: # The key below needs deeper (four-space) indentation, will complain otherwise
          host: ${VMHOST} # Env var defined externally, passed via -config.expand-env=true
I still have no idea whether there is a pipeline-stage solution.
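For the record, one pipeline-stage direction that should work in principle is Promtail's match stage, which takes a LogQL stream selector and can drop every line belonging to streams that match it. A sketch based on the match stage documentation, not verified against this exact setup:

```yaml
pipeline_stages:
  - match:
      # Matches streams whose 'container' label does NOT start
      # with "interesting_" and drops all of their lines.
      selector: '{container!~"interesting_.*"}'
      action: drop
```

An alternative that avoids scraping the garbage containers in the first place is a Prometheus-style relabel action in relabel_configs: an entry with action: keep and regex: '/interesting_.*' on __meta_docker_container_name should drop whole targets at discovery time rather than filtering line by line.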