apache-kafka, microservices, google-cloud-pubsub, messaging

Separate producer and consumer microservices


New to event-driven architecture. I would like to create a microservice that reads from a Google Cloud Pub/Sub topic, processes the message, and then puts the message on an internal Kafka topic for other services.

My question is: should a single microservice handle consuming, processing, and producing messages, or should I have separate microservices for consuming and for processing/producing?

For example: a single reader microservice responsible for reading from the Google Cloud Pub/Sub topic and putting each message onto an internal topic, plus a second processing microservice (which can be scaled up) that reads the internal topic, processes the message, and puts it onto a processed topic. One benefit I see in this option is that any processing problems would not affect getting the messages out of Google Cloud.
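
Roughly, the single combined service would look something like the sketch below (a minimal sketch assuming the google-cloud-pubsub and confluent-kafka client libraries; the project, subscription, and topic names are placeholders, and the processing step is just illustrative):

```python
from concurrent import futures

from google.cloud import pubsub_v1
from confluent_kafka import Producer

# Placeholder names - adjust to your environment.
PROJECT_ID = "my-gcp-project"
SUBSCRIPTION_ID = "incoming-events-sub"
KAFKA_BOOTSTRAP = "localhost:9092"
KAFKA_TOPIC = "internal.events"

producer = Producer({"bootstrap.servers": KAFKA_BOOTSTRAP})
subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)


def process(payload: bytes) -> bytes:
    # Illustrative processing step - in practice this is the business logic.
    return payload.upper()


def callback(message: pubsub_v1.subscriber.message.Message) -> None:
    # Consume from Pub/Sub, process, then produce to the internal Kafka topic.
    producer.produce(KAFKA_TOPIC, value=process(message.data))
    producer.poll(0)   # serve delivery callbacks
    message.ack()      # ack only after handing off to Kafka


streaming_pull = subscriber.subscribe(subscription_path, callback=callback)
with subscriber:
    try:
        streaming_pull.result()  # block and consume indefinitely
    except (KeyboardInterrupt, futures.TimeoutError):
        streaming_pull.cancel()
        streaming_pull.result()
producer.flush()
```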

Any advice on best practice would be appreciated.


Solution

  • Ideally, you limit processing as much as possible: either process within Pub/Sub and copy only the "good data" across, or create "landing zone" topics in Kafka for each Pub/Sub topic and do all of the processing within Kafka.

    But you can use Kafka Connect for the copy part, which offers minimal message-processing features called Single Message Transforms (SMTs); a sketch follows below.
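
As a rough illustration, assuming the Google Cloud Pub/Sub source connector (com.google.pubsub.kafka.source.CloudPubSubSourceConnector) is installed and using placeholder names throughout, registering such a copy job with a transform via Connect's REST API could look like this:

```python
import requests

# Placeholder Connect worker endpoint - adjust to your environment.
CONNECT_URL = "http://localhost:8083/connectors"

connector = {
    "name": "pubsub-landing-zone",
    "config": {
        # Assumes the Google Cloud Pub/Sub source connector is installed on the worker.
        "connector.class": "com.google.pubsub.kafka.source.CloudPubSubSourceConnector",
        "tasks.max": "1",
        "cps.project": "my-gcp-project",
        "cps.subscription": "incoming-events-sub",
        "kafka.topic": "pubsub.landing.events",
        # Single Message Transform: tag each record with its origin.
        # (Only applies if the connector emits structured values; raw bytes
        # would need a converter first.)
        "transforms": "addOrigin",
        "transforms.addOrigin.type": "org.apache.kafka.connect.transforms.InsertField$Value",
        "transforms.addOrigin.static.field": "origin",
        "transforms.addOrigin.static.value": "pubsub",
    },
}

resp = requests.post(CONNECT_URL, json=connector, timeout=10)
resp.raise_for_status()
print(resp.json())
```

This keeps the copy step as configuration rather than a custom service, so the heavier processing can live in a separately scaled consumer of the landing-zone topic.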