elasticsearch · logstash · kibana · elastic-stack

How to Analyze logs from multiple sources in ELK


I have recently started working on ELK and have a question about handling multiple types of logs. There are two sets of logs on my server that I want to analyse: one from my Android application and the other from my website. I have successfully shipped the logs from this server to the ELK server via Filebeat, created a filter for each type of log, and imported them into Logstash and then Kibana.

This tutorial helped me with the steps above:

https://www.digitalocean.com/community/tutorials/how-to-install-elasticsearch-logstash-and-kibana-elk-stack-on-centos-7

The tutorial above says to use the logs in the filebeat index in Kibana and start analysing (I did this successfully for one type of log). The problem I am facing is that since these two sets of logs are very different, they need to be analysed differently. How do I do this in Kibana? Should I create multiple Filebeat indexes and import them separately, should everything go into one single index, or is there some other way? I am not clear on this and could not find much documentation, so I would appreciate some help and guidance here.


Solution

  • Elasticsearch organizes data by index and type. Elastic used to compare these to SQL concepts (databases and tables), but now offers a new explanation.

    Since you say that the logs are very different, that guidance points to using separate indexes.
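
    A common way to achieve this is to route each log type to its own index in the Logstash output, based on the `type` field that Filebeat sets. This is a sketch, not your exact config: it assumes Filebeat tags the two streams with `document_type` values of `android-log` and `website-log` (rename to match your setup), and that Elasticsearch runs on `localhost:9200`:

    ```conf
    # Logstash output: write each log type to a separate index.
    # [type] comes from Filebeat's document_type setting (assumed names).
    output {
      if [type] == "android-log" {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "android-logs-%{+YYYY.MM.dd}"
        }
      } else if [type] == "website-log" {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "website-logs-%{+YYYY.MM.dd}"
        }
      }
    }
    ```

    With daily indexes like these, you would then define one index pattern per log type in Kibana (e.g. `android-logs-*` and `website-logs-*`) and analyse each on its own terms.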

    In Kibana, each visualization is tied to an index. If you build one panel from each index, you can still show them both on the same dashboard.
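
    For the two index approach to work, Filebeat needs to label the two log streams so Logstash can tell them apart. A minimal sketch, using the prospectors syntax from the tutorial you linked (the paths and type names here are placeholders, not your real ones):

    ```yaml
    # filebeat.yml: one prospector per log source, each with its own type
    filebeat:
      prospectors:
        - paths:
            - /var/log/android-app/*.log   # placeholder path
          document_type: android-log
        - paths:
            - /var/log/website/*.log       # placeholder path
          document_type: website-log
    ```

    The `document_type` value arrives in Logstash as the `type` field, which is what a conditional in the Logstash output (or filter) section can branch on.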