I want to send Metricbeat data to Kafka, and from Kafka to Logstash.
Here is my metricbeat.yml
metricbeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml
  # Set to true to enable config reloading
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1
  index.codec: best_compression

setup.dashboards.enabled: false

output.kafka:
  hosts: ["kafka:9092"]   # I only have one host.
  topic: "%{[fields.log_topic]}"
  compression: gzip

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
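One detail worth noting: `topic: "%{[fields.log_topic]}"` resolves the topic name from a field on each event, but nothing in this file sets `fields.log_topic`, so the topic may come out empty. A sketch of what defining that field could look like (the value `test` is an assumption chosen to match the Logstash input below):

```
# Hypothetical addition to metricbeat.yml: give every event the field
# that the topic format string expects.
fields:
  log_topic: "test"

output.kafka:
  hosts: ["kafka:9092"]
  topic: "%{[fields.log_topic]}"
```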
And from Kafka to Logstash, this is the configuration file.
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    manage_template => false
    index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
From Logstash, I want to send data to Elasticsearch to be visualized by Kibana.
However, I see no index in Elasticsearch.
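When no index appears, it helps to check each hop of the pipeline separately. A couple of diagnostic commands, assuming default ports and the Kafka scripts on the PATH (shown with the Windows `.bat` names; on Linux they end in `.sh`):

```shell
# Confirm that events are actually arriving on the Kafka topic.
# If this prints nothing, the problem is on the Metricbeat -> Kafka side.
kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic test --from-beginning

# List all indices in Elasticsearch to see whether anything was created.
curl "http://localhost:9200/_cat/indices?v"
```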
On Windows, I run .\metricbeat.exe setup -e before starting the service with .\start-service metricbeat
Elasticsearch server, Kibana server, Zookeeper server, Kafka server are running fine.
My Logstash looks okay. Below is what I see in my cmd window.
[2019-05-23T17:26:51,668][INFO ][org.apache.kafka.common.utils.AppInfoParser] Kafka version : 2.1.0
[2019-05-23T17:26:51,738][INFO ][org.apache.kafka.common.utils.AppInfoParser] Kafka commitId : eec43959745f444f
[2019-05-23T17:26:52,208][INFO ][org.apache.kafka.clients.Metadata] Cluster ID: eJYo7GgaTZitGoeiROlk2w
[2019-05-23T17:26:52,211][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2019-05-23T17:26:52,222][INFO ][org.apache.kafka.clients.consumer.internals.AbstractCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Discovered group coordinator DESKTOP-MOVCIN1:9092 (id: 2147483647 rack: null)
[2019-05-23T17:26:52,229][INFO ][org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Revoking previously assigned partitions []
[2019-05-23T17:26:52,231][INFO ][org.apache.kafka.clients.consumer.internals.AbstractCoordinator] [Consumer clientId=logstash-0, groupId=logstash] (Re-)joining group
[2019-05-23T17:26:52,274][INFO ][org.apache.kafka.clients.consumer.internals.AbstractCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Successfully joined group with generation 23
[2019-05-23T17:26:52,281][INFO ][org.apache.kafka.clients.consumer.internals.ConsumerCoordinator] [Consumer clientId=logstash-0, groupId=logstash] Setting newly assigned partitions [test-0]
Can anyone give me some guidance?
I finally managed to collect system data with Metricbeat, send it to Logstash via Kafka, store it in Elasticsearch, and view it in Kibana.
This isn't an ideal answer yet; I'll update it as I understand things better.
For the metricbeat.yml configuration,
output.kafka:
  hosts: ["localhost:9092"]
  topic: "testkafka"   # I created this topic in Kafka earlier.
For the Logstash configuration,
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["testkafka"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "testkafka"
  }
}
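A likely reason the original `%{[@metadata][beat]}` index pattern produced nothing: Beats publish events to Kafka as JSON strings, while the Logstash kafka input uses the plain codec by default, so each event arrives as a single unparsed message field. A sketch of the input with JSON decoding enabled (whether the `@metadata` fields survive the Kafka round trip is an assumption to verify in your own setup):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["testkafka"]
    codec => "json"   # decode the Beats JSON payload into event fields
  }
}
```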
At least with these configurations, I was able to integrate the ELK stack with K (Kafka).