Tags: elasticsearch, logstash, kibana, elastic-stack, logstash-grok

How to map between Logstash-extracted fields


I am extracting info from logfiles, and I want to map the extracted fields together for aggregations. Here's a sample logfile:

2017-01-01 07:53:44 [monitor_utils.py] INFO: Crawled iteration for merchant ariika started
2017-01-01 07:53:44 [utils.py] INFO: UpdateCrawlIteration._start_crawl_iteration function took 0.127 s
2017-01-01 07:57:22 [statscollectors.py] INFO: Dumping Scrapy stats:
{'item_scraped_count': 22,
 'invalid_items_count': 84}

I am extracting the merchant name (ariika) from the first line, and item_scraped_count and invalid_items_count from the last two lines. I have a different logfile for each merchant, and I want to see the items scraped count per logfile for each merchant using Kibana.
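
In case it helps, here is a minimal sketch of a Logstash grok filter for pulling those fields out. The pattern layout and field names (merchant, item_scraped_count, invalid_items_count) are my own, and it assumes the multi-line Scrapy stats dump has already been merged into a single event (for example with a multiline codec), so adjust it to your actual pipeline:

filter {
  grok {
    # Try every pattern against each event; break_on_match defaults to true,
    # which would stop after the first pattern that matches.
    break_on_match => false
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:timestamp} \[%{DATA:module}\] %{LOGLEVEL:level}: Crawled iteration for merchant %{WORD:merchant} started",
        "'item_scraped_count': %{NUMBER:item_scraped_count:int}",
        "'invalid_items_count': %{NUMBER:invalid_items_count:int}"
      ]
    }
  }
}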

How can I filter between one merchant and another in my case?


Solution

  • If I understand you correctly, I believe the source field in Kibana can help you. The source field indicates the name of the log file.

    Using the bar chart, you can choose the metric you want based on your item_scraped_count field, then create a bucket aggregation using the source field, and filter on the merchant field in the Kibana search bar (a query-level sketch of this is shown after this list).

    Or, on top of the first aggregation, you can create another aggregation using the merchant field and choose split bars, without using the Kibana search bar.
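
    As a rough illustration of the first approach, here is the equivalent Elasticsearch aggregation you could run from Kibana Dev Tools. The index pattern (logstash-*), the .keyword sub-fields, and the field names are assumptions about your mapping, so substitute your own:

    GET logstash-*/_search
    {
      "size": 0,
      "query": {
        "term": { "merchant.keyword": "ariika" }
      },
      "aggs": {
        "per_logfile": {
          "terms": { "field": "source.keyword" },
          "aggs": {
            "items_scraped": { "sum": { "field": "item_scraped_count" } },
            "invalid_items": { "sum": { "field": "invalid_items_count" } }
          }
        }
      }
    }

    The term query plays the role of the search-bar filter, and the terms aggregation on source buckets the results per logfile, with the two sums computed inside each bucket.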