I have a Python process writing the following example JSON log line:
{"levelname": "DEBUG", "asctime": "2020-02-04 08:37:42,128", "module": "scale_out", "thread": 139793342834496, "filename": "scale_out.py", "lineno": 130, "funcName": "_check_if_can_remove_inactive_components", "message": "inactive_components: set([]), num_of_components_active: 0, max num_of_components_to_keep: 1"}
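For context, a line like that can be produced with a stdlib-only JSON formatter. This is a minimal sketch, not my actual logging setup (which may well use python-json-logger or similar); the field names are copied from the example line above:

```python
import io
import json
import logging

class JsonLineFormatter(logging.Formatter):
    """Hypothetical formatter emitting one JSON object per log record."""
    def format(self, record):
        return json.dumps({
            "levelname": record.levelname,
            "asctime": self.formatTime(record),
            "module": record.module,
            "thread": record.thread,
            "filename": record.filename,
            "lineno": record.lineno,
            "funcName": record.funcName,
            "message": record.getMessage(),
        })

# Write one DEBUG record into an in-memory buffer instead of a file.
buf = io.StringIO()
handler = logging.StreamHandler(buf)
handler.setFormatter(JsonLineFormatter())
logger = logging.getLogger("scale_out")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

logger.debug("inactive_components: %s, num_of_components_active: %d", set(), 0)

line = buf.getvalue().strip()
print(json.loads(line)["levelname"])  # DEBUG
```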
In my filebeat.yml, I'm trying to exclude all DEBUG logs from being sent to Elasticsearch.
I've tried using the exclude_lines option, but Filebeat still publishes these events.
I've also tried using a processor with drop_event:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/my_service/*.log
  json.keys_under_root: true
  json.add_error_key: true
  json.message_key: "module"
  exclude_lines: ['DEBUG'] # also tried ['.*DEBUG.*']
  keep_null: true
  processors:
    - drop_event:
        when:
          levelname: 'DEBUG'
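As a sanity check, the exclude_lines patterns themselves do match the raw line. Filebeat compiles these as Go regular expressions, but for a plain substring pattern like this one, Python's re module behaves identically, so this is just a quick check of the patterns, not of Filebeat itself:

```python
import re

# Abbreviated copy of the raw JSON log line from above.
raw_line = (
    '{"levelname": "DEBUG", "asctime": "2020-02-04 08:37:42,128", '
    '"module": "scale_out", "message": "inactive_components: set([])"}'
)

# Both patterns from the config find a match on the raw line.
print(bool(re.search(r'DEBUG', raw_line)))      # True
print(bool(re.search(r'.*DEBUG.*', raw_line)))  # True
```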
Any ideas what I might be doing wrong?
Well, it was much easier (and sillier) than I expected. While exclude_lines still doesn't work, I was able to get drop_event to work.
The problem was that 'DEBUG' should have been written without quotes.
processors:
  - drop_event:
      when:
        levelname: DEBUG
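What the working processor does can be emulated in a few lines of Python: decode each JSON event and discard it when levelname equals DEBUG. This is only a sketch of the drop semantics, not Filebeat's implementation:

```python
import json

def drop_debug(lines):
    """Keep only events whose levelname is not DEBUG, like drop_event."""
    kept = []
    for line in lines:
        event = json.loads(line)
        if event.get("levelname") == "DEBUG":
            continue  # dropped: never reaches Elasticsearch
        kept.append(event)
    return kept

logs = [
    '{"levelname": "DEBUG", "message": "noisy internals"}',
    '{"levelname": "INFO", "message": "useful event"}',
]
print([e["message"] for e in drop_debug(logs)])  # ['useful event']
```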