Tags: apache-kafka, confluent-platform, ksqldb, debezium

How to create a Schema Registry subject for ksqlDB from a Kafka topic?


I use a MySQL database. Suppose I have a table for orders, and the orders topic has already been created in Kafka by the Debezium MySQL connector. But I have trouble creating a stream for it in ksqlDB:

CREATE STREAM orders WITH (
    kafka_topic = 'myserver.mydatabase.orders',
    value_format = 'avro'
);
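A likely cause (an assumption based on the compose file below): the Connect workers are configured with `JsonConverter`, so no Avro schema is ever registered in Schema Registry, and `value_format = 'avro'` has no subject to look up. As a sketch, reading the topic as JSON instead would require declaring the columns by hand, since JSON carries no registered schema (column names here are hypothetical):

```sql
-- Hypothetical columns; with JSON there is no schema in the registry,
-- so ksqlDB cannot infer them and they must be listed explicitly:
CREATE STREAM orders_json (
    id INT,
    customer VARCHAR,
    total DOUBLE
) WITH (
    kafka_topic = 'myserver.mydatabase.orders',
    value_format = 'json'
);
```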

My docker-compose file looks like this:

  zookeeper:
    image: confluentinc/cp-zookeeper:latest
    container_name: zookeeper
    privileged: true
    ports:
      - "2181:2181"
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181
      ZOOKEEPER_TICK_TIME: 2000

  kafka:
    image: confluentinc/cp-kafka:latest
    container_name: kafka
    depends_on:
      - zookeeper
    ports:
      - '9092:9092'
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
      KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1
      KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1


  schema-registry:
    image: confluentinc/cp-schema-registry:latest
    container_name: schema-registry
    depends_on:
      - kafka
      - zookeeper
    ports:
      - "8081:8081"
    environment:
      SCHEMA_REGISTRY_KAFKASTORE_CONNECTION_URL: "zookeeper:2181"
      SCHEMA_REGISTRY_HOST_NAME: schema-registry

  kafka-connect:
    hostname: kafka-connect
    image: confluentinc/cp-kafka-connect:latest
    container_name: kafka-connect
    ports:
      - 8083:8083
    depends_on:
      - schema-registry
    environment:
      CONNECT_BOOTSTRAP_SERVERS: kafka:9092
      CONNECT_REST_PORT: 8083
      CONNECT_GROUP_ID: "quickstart-avro"
      CONNECT_CONFIG_STORAGE_TOPIC: "quickstart-avro-config"
      CONNECT_OFFSET_STORAGE_TOPIC: "quickstart-avro-offsets"
      CONNECT_STATUS_STORAGE_TOPIC: "quickstart-avro-status"
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_KEY_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_INTERNAL_VALUE_CONVERTER: "org.apache.kafka.connect.json.JsonConverter"
      CONNECT_REST_ADVERTISED_HOST_NAME: "kafka-connect"
      CONNECT_LOG4J_ROOT_LOGLEVEL: DEBUG
      CONNECT_PLUGIN_PATH: "/usr/share/java,/etc/kafka-connect/jars"
    volumes:
      - $PWD/kafka/jars:/etc/kafka-connect/jars

  ksqldb-server:
    image: confluentinc/ksqldb-server:latest
    hostname: ksqldb-server
    container_name: ksqldb-server
    depends_on:
      - kafka
    ports:
      - "8088:8088"
    environment:
      KSQL_LISTENERS: http://0.0.0.0:8088
      KSQL_BOOTSTRAP_SERVERS: "kafka:9092"
      KSQL_KSQL_LOGGING_PROCESSING_STREAM_AUTO_CREATE: "true"
      KSQL_KSQL_LOGGING_PROCESSING_TOPIC_AUTO_CREATE: "true"
      KSQL_KSQL_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"
      KSQL_CONNECT_VALUE_CONVERTER_SCHEMA_REGISTRY_URL: "http://schema-registry:8081"

  ksqldb-cli:
    image: confluentinc/ksqldb-cli:latest
    container_name: ksqldb-cli
    depends_on:
      - kafka
      - ksqldb-server
      - schema-registry
    entrypoint: /bin/sh
    tty: true

It seems a Schema Registry subject must be created for this table first. Also, what is the difference between Avro and JSON here?
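Roughly: Avro messages are binary and reference a schema stored in Schema Registry under a subject (typically `<topic>-value`), while JSON messages are self-describing text and register nothing. One way to check which format a topic actually contains is ksqlDB's `PRINT` statement, which reports the key and value formats it detects (topic name taken from the question):

```sql
-- Dump one record from the beginning of the topic; the output header
-- shows the key/value formats ksqlDB inferred:
PRINT 'myserver.mydatabase.orders' FROM BEGINNING LIMIT 1;
```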


Solution

  • I solved the issue. With this connector configuration, the MySQL table rows are written to the topic without the Debezium before/after change envelope:

    CREATE SOURCE CONNECTOR final_connector WITH (
        'connector.class' = 'io.debezium.connector.mysql.MySqlConnector',
        'database.hostname' = 'mysql',
        'database.port' = '3306',
        'database.user' = 'root',
        'database.password' = 'mypassword',
        'database.allowPublicKeyRetrieval' = 'true',
        'database.server.id' = '184055',
        'database.server.name' = 'db',
        'database.whitelist' = 'mydb',
        'database.history.kafka.bootstrap.servers' = 'kafka:9092',
        'database.history.kafka.topic' = 'mydb',
        'table.whitelist' = 'mydb.user',
        'include.schema.changes' = 'false',
        'transforms'= 'unwrap,extractkey',
        'transforms.unwrap.type'= 'io.debezium.transforms.ExtractNewRecordState',
        'transforms.extractkey.type'= 'org.apache.kafka.connect.transforms.ExtractField$Key',
        'transforms.extractkey.field'= 'id',
        'key.converter'= 'org.apache.kafka.connect.converters.IntegerConverter',
        'value.converter'= 'io.confluent.connect.avro.AvroConverter',
        'value.converter.schema.registry.url'= 'http://schema-registry:8081'
    );
    

Then you can create your stream simply.
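Because the connector above writes Avro values, ksqlDB can pull the column definitions from Schema Registry, so the stream needs no explicit column list. Assuming the connector produces the topic `db.mydb.user` (Debezium names topics `<database.server.name>.<database>.<table>`), a minimal sketch:

```sql
-- Columns are inferred from the Avro schema registered
-- under the subject "db.mydb.user-value":
CREATE STREAM users WITH (
    kafka_topic = 'db.mydb.user',
    value_format = 'avro'
);
```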

    This video can help a lot:

    https://www.youtube.com/watch?v=2fUOi9wJPhk&t=1550s