I'm running everything in Docker for Windows. Here is my NiFi setup:
Details on the PublishKafka processor:
Details on the ConsumeKafka processor:
Here is my docker-compose file (note: 192.168.1.50 is my static internal host IP):
version: '3'
services:
  Jenkins:
    container_name: Jenkins
    restart: on-failure
    depends_on:
      - NiFi
    image: jenkins:latest
    ports:
      - "32779:50000"
      - "32780:8080"
  NiFi:
    container_name: NiFi
    image: xemuliam/nifi:latest
    restart: on-failure
    depends_on:
      - kafka
    ports:
      - "32784:8089"
      - "32783:8080"
      - "32782:8081"
      - "32781:8443"
    labels:
      com.foo: myLabel
  zookeeper:
    container_name: Zookeeper
    image: wurstmeister/zookeeper
    restart: on-failure
    #network_mode: host
    ports:
      - "2181:2181"
  kafka:
    #container_name: Kafka
    image: wurstmeister/kafka
    depends_on:
      - zookeeper
    #restart: on-failure
    #network_mode: host
    ports:
      - "9092"
    environment:
      #KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://192.168.1.50:9092
      #KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"
      KAFKA_CREATE_TOPICS: "MainIngestionTopic:1:1"
      KAFKA_ZOOKEEPER_CONNECT: 192.168.1.50:2181
      KAFKA_ADVERTISED_LISTENERS: INSIDE://:9092,OUTSIDE://192.168.1.50:9094
      KAFKA_LISTENERS: INSIDE://:9092,OUTSIDE://:9094
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
    volumes:
      - ./var/run/docker.sock:/var/run/docker.sock
When I tail the Kafka container log, I can see that the topic defined in docker-compose was created successfully.
Messages are successfully delivered to the PublishKafka processor in NiFi, but then fail to publish. The ConsumeKafka processor, which is subscribed to the same topic, never receives the message.
The NiFi container log shows the following:
2018-05-28 19:46:18,792 ERROR [Timer-Driven Process Thread-1] o.a.n.p.kafka.pubsub.PublishKafka PublishKafka[id=b2503f49-acc9-38f5-86f9-5029e2768b68] Failed to send all message for StandardFlowFileRecord[uuid=b3f6f818-34d3-42a9-9d6e-636cf17eb138,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1527533792820-1, container=default, section=1], offset=5, length=5],offset=0,name=8151630985100,size=5] to Kafka; routing to failure due to org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.: org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.
org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 5000 ms.
2018-05-28 19:46:18,792 INFO [Timer-Driven Process Thread-1] o.a.kafka.clients.producer.KafkaProducer Closing the Kafka producer with timeoutMillis = 5000 ms.
I also tried publishing to the topic from inside the Kafka container itself, but that failed as well:
I have combed the documentation and read many threads trying to resolve this, but the problem persists. Any help would be greatly appreciated!
You can't use localhost in the "Kafka Brokers" property in NiFi unless the broker is actually running on the same host as NiFi. Since each of your services runs in its own Docker container, the Kafka container must have a specific hostname or IP that the NiFi container can reach, and that is the address NiFi has to use.
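As a rough sketch of what that can look like with the compose file above (the kafka service name, the 9094 host-port mapping, and the suggested NiFi property values are illustrative assumptions, not something from the original post): advertise the INSIDE listener on the compose service name and the OUTSIDE listener on the host IP, then point NiFi's "Kafka Brokers" property at one of those advertised addresses:

# Sketch only: kafka service excerpt, reusing the host IP from the question.
# NiFi sits in the same compose network, so it could use kafka:9092 as its
# "Kafka Brokers" value; clients on the host would use 192.168.1.50:9094.
kafka:
  image: wurstmeister/kafka
  depends_on:
    - zookeeper
  ports:
    - "9094:9094"   # publish the OUTSIDE listener so host clients can reach it
  environment:
    KAFKA_CREATE_TOPICS: "MainIngestionTopic:1:1"
    KAFKA_ZOOKEEPER_CONNECT: 192.168.1.50:2181
    KAFKA_LISTENERS: INSIDE://:9092,OUTSIDE://:9094
    # INSIDE is advertised on the compose service name, OUTSIDE on the host IP
    KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9092,OUTSIDE://192.168.1.50:9094
    KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
    KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE

Whichever listener you point NiFi at, the address in the "Kafka Brokers" property has to be one the broker actually advertises and one the NiFi container can resolve; localhost inside the NiFi container refers only to the NiFi container itself.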