Tags: docker, elasticsearch, filebeat

How to collect docker logs using Filebeat?


I am trying to collect logs like these from a docker container:

    [1620579277][642e7adc-74e1-4b89-a705-d271846f7ebc][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set] ex02 set

    [1620579277][ac9f99b7-0126-45ed-8a74-6adc3a9d6bc5][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set][Transaction] Aval =201 Bval =301 after performing the transaction

    [1620579277][9211a9d4-3fe6-49db-b245-91ddd3a11cd3][channel1][afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4][usecase_cc][set][Transaction] Transaction makes payment of X units from A to B

    [1620579280][0391d2ce-06c1-481b-9140-e143067a9c2d][channel1][1f5752224da4481e1dc4d23dec0938fd65f6ae7b989aaa26daa6b2aeea370084][usecase_cc][get] Query Response: {"Name":"a","Amount":"200"}

I have set up filebeat.yml this way:

    filebeat.inputs:
    - type: container
      paths:
        - '/var/lib/docker/containers/container-id/container-id.log'

    processors:
    - add_docker_metadata:
        host: "unix:///var/run/docker.sock"
    - dissect:
        tokenizer: '{"log":"[%{time}][%{uuid}][%{channel}][%{id}][%{chaincode}][%{method}] %{specificinfo}\"\n%{}'
        field: "message"
        target_prefix: ""

    output.elasticsearch:
      hosts: ["elasticsearch:9200"]
      username: "elastic"
      password: "changeme"
      indices:
        - index: "filebeat-%{[agent.version]}-%{+yyyy.MM.dd}"

    logging.json: true
    logging.metrics.enabled: false
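One way to sanity-check the bracketed fields before shipping logs is to replay the pattern locally. The sketch below approximates the dissect tokenizer with a regex; this is only an approximation (Filebeat's dissect processor is not regex-based), and the field names simply mirror the ones in the config above:

```python
import re

# One log line as it appears on the container's stdout.
message = (
    "[1620579277][642e7adc-74e1-4b89-a705-d271846f7ebc][channel1]"
    "[afca2a976fa482f429fff4a38e2ea49f337a8af1b5dca0de90410ecc792fd5a4]"
    "[usecase_cc][set] ex02 set"
)

# Regex standing in for the dissect tokenizer's %{...} keys:
# six bracketed fields, then a space, then the rest of the line.
pattern = re.compile(
    r"\[(?P<time>[^\]]*)\]\[(?P<uuid>[^\]]*)\]\[(?P<channel>[^\]]*)\]"
    r"\[(?P<id>[^\]]*)\]\[(?P<chaincode>[^\]]*)\]\[(?P<method>[^\]]*)\] "
    r"(?P<specificinfo>.*)"
)

fields = pattern.match(message).groupdict()
print(fields)
```

If a sample line fails to match here, the dissect processor will likely also fail on it and leave the event unparsed.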

Although Elasticsearch and Kibana are deployed successfully, I am getting this error when a new log is generated:

    {
      "error": {
        "root_cause": [
          {
            "type": "index_not_found_exception",
            "reason": "no such index [filebeat]",
            "resource.type": "index_or_alias",
            "resource.id": "filebeat",
            "index_uuid": "_na_",
            "index": "filebeat"
          }
        ],
        "type": "index_not_found_exception",
        "reason": "no such index [filebeat]",
        "resource.type": "index_or_alias",
        "resource.id": "filebeat",
        "index_uuid": "_na_",
        "index": "filebeat"
      },
      "status": 404
    }

Note: I am using version 7.12.1, and Kibana, Elasticsearch, and Logstash are deployed in Docker.


Solution

  • I ended up using Logstash as an alternative to Filebeat. However, the underlying mistake was that the path the logs are read from was mapped incorrectly in the Filebeat configuration file. To solve this issue:

    1. I created an environment variable pointing to the right place:

       (screenshot)

    2. I passed the environment variable as part of the Docker volume:

       (screenshot)

    3. I pointed the path in the configuration file at the path of the volume inside the container:

       (screenshot)
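Since the screenshots are missing, the three steps above might look roughly like the sketch below. Every name and path here is an illustrative assumption, not taken from the original setup:

```yaml
# Step 1 (.env file): an environment variable pointing at the real log
# location on the host. Hypothetical variable name and path:
#   CONTAINER_LOGS=/var/lib/docker/containers

# Step 2 (docker-compose.yml excerpt): pass that variable as part of the
# log collector's volume mapping. Hypothetical service name and mount point:
filebeat:
  volumes:
    - ${CONTAINER_LOGS}:/usr/share/filebeat/containers:ro

# Step 3 (collector config): point the input path at the volume's path
# inside the container, not at the host path:
#   paths:
#     - '/usr/share/filebeat/containers/*/*.log'
```

The key point is that the path in the collector's configuration must be the path as seen inside its own container, which only exists if the volume mapping in step 2 puts it there.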