apache-kafka · apache-kafka-streams · apache-kafka-connect · confluent-platform

Which team is responsible for the Kafka producer code when an event is generated?


I have basic knowledge of Kafka topics, producers, consumers, and brokers.

I would like to understand how this works in the real world.

For example, consider the use case below:

  1. A user interacts with a web application.
  2. When the user clicks on something, an event is generated.
  3. A Kafka producer is running that writes a message to a topic when the event is generated.
  4. A consumer (e.g. a Spark application) reads from the topic and processes the data.
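To make the producer side of this flow concrete, here is a minimal sketch using the official Kafka Java client. The topic name (`user-clicks`), broker address, and event payload are made up for illustration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class ClickEventProducer {
    public static void main(String[] args) {
        // Connection settings; the broker address is an assumption for this example
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Steps 2-3: the backend receives the click event and produces it to a topic
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            ProducerRecord<String, String> record = new ProducerRecord<>(
                "user-clicks",                                      // topic (hypothetical)
                "user-42",                                          // key: e.g. the user id
                "{\"action\":\"click\",\"target\":\"buy-button\"}"  // value: the event payload
            );
            producer.send(record); // asynchronous; a Spark job (step 4) would consume from "user-clicks"
        }
    }
}
```

In practice this producer code lives inside the backend that handles the user's HTTP request, not in the browser.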

Whose responsibility is it to take care of the producer code? A front-end Java/web developer's? Because web developers are familiar with events and Tomcat server logs.

Can anyone explain, in terms of developers, who is responsible for each part?


Solution

  • In a "standard" scenario, the following roles are involved:

    1. Infrastructure dev: sets up the Kafka instance (e.g. on OpenShift with Strimzi)
      • manages topics and users
    2. Frontend dev: creates the frontend (e.g. React)
    3. Backend dev: implements the backend system (e.g. ASP.NET Core)
      • handles DB connections, logging, monitoring, IAM, business logic, handling events, producing Kafka events, ...
    4. App dev: anyone writing or managing the "other apps" (e.g. a Spark application); consumes (and commits) the Kafka events

    Since there are plenty of implementations of the Kafka producer/consumer API, it's fairly language-agnostic (see some libs). But you are right: the dev implementing the Kafka-related features should at least be familiar with pub-sub.
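For role 4, the consuming side could look like the following sketch with the Java client. The topic, group id, and broker address are assumptions; auto-commit is disabled so the offset is committed only after the records are processed, matching the "consumes (commit)" step above:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ClickEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");      // assumption
        props.put("group.id", "click-analytics");              // consumer group, made up
        props.put("enable.auto.commit", "false");              // commit manually after processing
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("user-clicks"));
            // A real app would loop; one poll is shown for brevity
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
            consumer.commitSync(); // acknowledge the processed events
        }
    }
}
```

A framework like Spark Structured Streaming would hide most of this loop behind its Kafka source, but the same consume-then-commit contract applies.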

    Be aware that we are talking about roles, so there are not necessarily four people involved; it could also be one person doing the whole job. Also, this is just a generic real-world scenario and can look completely different in your specific use case or environment.