Tags: cqrs, event-sourcing, event-driven-design

CQRS and Event Sourcing - Read your own events


Currently, I'm implementing an Event Driven Architecture and have one service for commands (the write side) and another service for queries (the read side).

Here is what I'm doing right now:

  1. Accept a command in CommandService
  2. Store the event and publish it on an event bus
  3. ReadService listens to these events and updates the read models (see the sketch after this list)
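
A minimal sketch of that write-side flow in Java. The EventStore and EventBus interfaces, the PlaceOrder command, and the OrderPlaced event are all hypothetical names for illustration, not from any specific framework:

```java
import java.time.Instant;
import java.util.UUID;

public class CommandService {

    // Hypothetical ports; any event store / message bus client would do.
    interface EventStore { void append(OrderPlaced event); }
    interface EventBus   { void publish(OrderPlaced event); }

    record PlaceOrder(String customerId, String sku) {}
    record OrderPlaced(UUID eventId, String customerId, String sku, Instant occurredAt) {}

    private final EventStore eventStore;
    private final EventBus eventBus;

    CommandService(EventStore eventStore, EventBus eventBus) {
        this.eventStore = eventStore;
        this.eventBus = eventBus;
    }

    // Steps 1-2: accept the command, store the resulting event, publish it.
    // Step 3 happens in ReadService, which subscribes to the bus.
    public void handle(PlaceOrder cmd) {
        var event = new OrderPlaced(UUID.randomUUID(), cmd.customerId(), cmd.sku(), Instant.now());
        eventStore.append(event);  // the event store is the source of truth
        eventBus.publish(event);   // notify subscribers such as ReadService
    }
}
```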

This works well as long as you listen only to your own events. But suppose CommandService also listens to an external event:

  1. Listen to the external event
  2. Process a command for this event
  3. Store the event that your domain generated in your event store and publish it to the event bus
  4. ReadService listens to these events and updates the read models (sketched below)
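
Continuing the same hypothetical names, the extra hop looks roughly like this: a listener in CommandService translates the external event into a command, and the event that command produces then takes the normal store-and-publish path:

```java
public class ExternalEventListener {

    private final CommandService commandService;

    ExternalEventListener(CommandService commandService) {
        this.commandService = commandService;
    }

    // Invoked by the event-bus subscription (wiring omitted). The external
    // event type and the command it maps to are illustrative assumptions.
    public void onCustomerRegistered(String customerId) {
        // Steps 2-3: process a command for the external event; the resulting
        // OrderPlaced event goes through the event store and bus again before
        // ReadService sees it, which is the second latency described below.
        commandService.handle(new CommandService.PlaceOrder(customerId, "WELCOME-PACK"));
    }
}
```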

In this approach, I can see that there is double latency before my read models are updated: the first latency is the time CommandService takes to pull the external event, and the second is the time ReadService takes to pull the event that CommandService generated.

I'm thinking that if I change my ReadService to listen to the CommandService event store directly, without the event bus in between, I can eliminate one of these latencies.
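
A hedged sketch of that idea, assuming the event store is a relational table; the events table and sequence_id column are guesses at CommandService's schema, not its real layout:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class EventStorePoller {

    private long lastSeenSequence = 0L;

    // Tail the CommandService event store table directly, skipping the bus.
    // Note that this couples ReadService to CommandService's table layout.
    public void pollOnce(Connection conn) throws Exception {
        String sql = "SELECT sequence_id, payload FROM events WHERE sequence_id > ? ORDER BY sequence_id";
        try (PreparedStatement ps = conn.prepareStatement(sql)) {
            ps.setLong(1, lastSeenSequence);
            try (ResultSet rs = ps.executeQuery()) {
                while (rs.next()) {
                    applyToReadModel(rs.getString("payload"));  // project the event
                    lastSeenSequence = rs.getLong("sequence_id");
                }
            }
        }
    }

    private void applyToReadModel(String payload) {
        // deserialize the payload and update the read model (omitted)
    }
}
```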

What do you think?


Solution

  • We did something similar some time ago. Basically, we had:

    1. A service implementing the process manager pattern and doing some orchestration logic between multiple services using Event Sourcing and Sagas. Each event stored in the database contains an EventPayload serialized in Avro format, holding the state of the process at the point in time when the event occurred. That payload contains the entire state, not just the differences, so we kept updating it during processing.
    2. We used the Kafka Connect JDBC Source Connector to read from the EventStore: as soon as a new event was stored in the EventStore, the connector read it and published it to Kafka (Kafka was used as the EventBus). An example connector configuration is sketched after this answer.
    3. The read model, which lived in another service, was updated through Kafka (there are two approaches here: using the Kafka Connect JDBC Sink Connector, or using a Kafka Consumer and applying the updates transactionally from the consumer; a consumer sketch also follows after this answer).

    I hope this helps!
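
To make point 2 concrete, here is a hedged example of a Kafka Connect JDBC Source Connector configuration that tails an event store table in incrementing mode. The connection URL, table name, and column name are illustrative assumptions, not values from our setup:

```properties
name=eventstore-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
connection.url=jdbc:postgresql://localhost:5432/eventstore
mode=incrementing
incrementing.column.name=sequence_id
table.whitelist=events
topic.prefix=eventstore-
poll.interval.ms=500
```

And for the consumer variant in point 3, a minimal Java sketch that applies each batch of events to the read model inside a database transaction and commits the Kafka offsets only afterwards. Topic, group id, and connection details are assumptions:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class ReadModelUpdater {

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "read-model-updater");
        props.put("enable.auto.commit", "false"); // offsets are committed manually below
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost:5432/readmodel")) {
            consumer.subscribe(List.of("eventstore-events"));
            conn.setAutoCommit(false);
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // project record.value() into the read model tables (omitted)
                }
                conn.commit();         // 1. commit the read-model updates
                consumer.commitSync(); // 2. then commit offsets; a crash in between means
                                       //    redelivery, so projections should be idempotent
            }
        }
    }
}
```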