docker, apache-kafka, docker-compose, docker-machine, confluent-platform

Docker Setup - Networking between multiple containers


On my Linux server, I am running 3 containers -

A) Kafka and Zookeeper with this docker-compose file -

version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper:3.4.6
    ports:
     - "2181:2181"
  kafka:
    image: wurstmeister/kafka:2.11-2.0.0
    ports:
     - "9092:9092"
    expose:
     - "9093"
    environment:
      KAFKA_ADVERTISED_LISTENERS: INSIDE://kafka:9093,OUTSIDE://localhost:9092
      KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: INSIDE:PLAINTEXT,OUTSIDE:PLAINTEXT
      KAFKA_LISTENERS: INSIDE://0.0.0.0:9093,OUTSIDE://0.0.0.0:9092
      KAFKA_INTER_BROKER_LISTENER_NAME: INSIDE
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181 

This opens up the Kafka broker to the host machine via the OUTSIDE listener on localhost:9092.
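If you want to verify this from the host (a quick check, assuming kafkacat is installed there), the OUTSIDE listener should answer on localhost:9092:

# List broker and topic metadata via the listener published to the host
kafkacat -b localhost:9092 -L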

B) JupyterHub

docker run -v /notebooks:/notebooks -p 8000:8000 jupyterhub

C) Confluent Schema Registry (I have not tried it yet, but in my final setup I will have a schema registry container as well)

docker run confluentinc/cp-schema-registry
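Note that this image will not come up with defaults alone. As a sketch based on Confluent's documented settings (the host name value is illustrative, and kafka:9093 assumes the container can resolve the broker by name, which the networking setup in the solution below provides):

docker run \
  -e SCHEMA_REGISTRY_HOST_NAME=schema-registry \
  -e SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS=PLAINTEXT://kafka:9093 \
  -p 8081:8081 \
  confluentinc/cp-schema-registry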

Both (A) and (B) start up without any issues. But how do I open up the JupyterHub container to the Kafka and Schema Registry ports so that my Python scripts can access the brokers?


Solution

  • I'm assuming you want to run your Jupyter notebook container on demand, whereas your Zookeeper and Kafka containers will always be running separately? You can create a Docker network and join all the containers to it. Your containers will then be able to resolve each other by name.

    1. Create a network
    2. Specify this network in the compose file
    3. When starting your other containers with docker run, use the --network option (see the sketch after this list).
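A minimal sketch of those three steps, assuming a user-defined bridge network named kafka-net (the name is illustrative):

1. Create the network once on the host:

docker network create kafka-net

2. Declare it as external in the compose file and attach both services (only the additions are shown; the rest of the file stays as in (A)):

version: '2'
services:
  zookeeper:
    networks:
     - kafka-net
  kafka:
    networks:
     - kafka-net
networks:
  kafka-net:
    external: true

3. Start the remaining containers on the same network, e.g. for (B):

docker run --network kafka-net -v /notebooks:/notebooks -p 8000:8000 jupyterhub

and likewise add --network kafka-net to the schema registry command from (C), together with the environment variables noted there.

One caveat on the listeners: from inside this network, Python clients in the JupyterHub container should bootstrap against kafka:9093 (the INSIDE listener), because the OUTSIDE listener advertises localhost:9092, which only resolves correctly from the host itself.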