I'm trying to fetch messages from a certain topic in Apache Kafka and record them in an Oracle database. The problem is that the ConsumeKafka_1_0 1.17.0 processor is not reading the messages that are in the Kafka topic.
I tried to use versions of ConsumeKafka_1_0, ConsumeKafka_2_0 and ConsumeKafka_2_6;
I set Offset Reset to Latest or Earliest;
I modified the Group ID on each request;
I set Honor Transactions to True or False;
My Kafka server is on the same network and doesn't use any Kerberos credentials or TLS certificates. It is configured on the default port 9092 and uses ZooKeeper on another server.
I accessed the broker server where the topics are, and successfully executed the commands to list the topics and messages.
Other processes here at the company where I work use Kafka without any problems.
Has anyone had this issue and managed to resolve it?
tried to use versions of ConsumeKafka_1_0, ConsumeKafka_2_0 and ConsumeKafka_2_6
That depends on which Kafka version you are using. If your broker is newer than 2.6, use ConsumeKafka_2_6; if it is newer than 2.0 but older than 2.6, use ConsumeKafka_2_0, and so on.
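If you are not sure which version the broker is running, one quick check is to run the version flag on the broker host (assuming the Kafka CLI tools installed there match the broker itself):
kafka-topics.sh --version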
set Offset Reset to Latest or Earliest
That depends on whether you care about data that is already in the topic. If so, use Earliest. Note that the offset reset only applies when the consumer group has no committed offset yet; an existing group keeps reading from its last committed position.
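To confirm the topic actually has readable data from the beginning, you can consume it outside NiFi (broker address and topic name here are just placeholders):
kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic foobar --from-beginning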
modified the Group ID on each request
That's a good debugging step, but not always necessary. You can use the kafka-consumer-groups.sh command to check whether the group was actually created in the Kafka cluster.
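For example, assuming the broker is at localhost:9092 and the group id is my-nifi-group:
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --list
kafka-consumer-groups.sh --bootstrap-server localhost:9092 --describe --group my-nifi-group
If the group never appears, NiFi is not reaching the broker at all; if it appears but the lag keeps growing, the processor is connected but not consuming.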
Honor Transactions to True or False
That depends on whether your producer uses transactions. I'm not sure what NiFi will do if this is set to True and your producer doesn't use them, but it is probably safe to leave it set to True.
For the PLAINTEXT Kafka protocol, set Security Protocol to PLAINTEXT.
For simple Kafka consumer debugging, use a GrokReader as the record reader, with the Grok Expression %{GREEDYDATA:message}.
For the internal NiFi FlowFile output, use a JSONRecordSetWriter as the record writer (you could use any format, but JSON is good to validate that your steps have worked).
Add new processors for the parse.failure relationship (or auto-terminate it under the Relationships tab), and set up the success relationship by click-dragging from the Consume processor. You can leave the next processor disabled so that FlowFiles simply queue up within NiFi. Then start the Consume processor and inspect the data in the queue.
Producer command:
echo 'Hello, World' | kcat -P -b localhost:9092 -t foobar
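And the consumer counterpart, to verify the message outside NiFi (same assumed broker and topic):
kcat -C -b localhost:9092 -t foobar -o beginning -e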