I'm new to the Kafka Streams world. I'm wondering when to use a Kafka Streams `GlobalKTable` (with a compacted topic under the hood) instead of a regular database for persisting data, and what the advantages and disadvantages of each solution are. I guess both ensure data persistence at the same level.
Let's say there is a simple e-commerce app where users register and update their data, and there are two microservices: the first one (`service-users`) is responsible for registering users and the second one (`service-orders`) is responsible for placing orders. Now there are two options:

`service-users` accepts a request, saves the newly registered user's data in its own database (SQL or NoSQL, doesn't matter) and then sends an event to Kafka to propagate it to other services. `service-orders` receives such an event and stores the necessary user data in its own database. This is the most common pattern (in my experience).

And now the second approach, with a `GlobalKTable`:
`service-users` accepts a request and sends an event with a snapshot of the user's data to Kafka. Both `service-users` and `service-orders` use a `GlobalKTable` to read information about users.

When should I use which solution? Which solution is better in which cases? What are the advantages and disadvantages of each approach? Doesn't the second approach break the rule 'each microservice should maintain its own data in its own database'?

I hope I explained my considerations well and that they make sense.
In general, the advantages of a `GlobalKTable` are:

- every application instance holds a full local copy of the data, so lookups are local point reads with no network round trip;
- a stream can be joined against it on a key derived from the record (not only the stream's own key), because each instance has the whole table;
- there is no separate database to operate: the local state store is rebuilt automatically from the compacted topic on startup or after a failure.

And the main disadvantages are:

- every instance stores the entire dataset, so memory and disk usage grow with the size of the table;
- reads are eventually consistent: an instance sees an update only after it has consumed it from the topic;
- you can only do key-based lookups (and range scans) against the store; there are no secondary indexes or ad-hoc queries like in a database;
- restoring the store on startup takes time proportional to the size of the compacted topic.
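The `GlobalKTable` approach can be sketched with the Kafka Streams DSL. This is a minimal sketch, not a drop-in implementation: the topic names (`users`, `orders`, `orders-enriched`), the plain-String serdes, and the `extractUserId` helper are all assumptions made for illustration.

```java
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Materialized;

public class OrdersTopology {

    public static StreamsBuilder build() {
        StreamsBuilder builder = new StreamsBuilder();

        // The users topic is compacted; the GlobalKTable replicates it
        // fully into a local state store on every service-orders instance.
        GlobalKTable<String, String> users = builder.globalTable(
                "users",
                Materialized.with(Serdes.String(), Serdes.String()));

        KStream<String, String> orders = builder.stream("orders");

        // Enrich each order with user data via a local lookup (no network
        // call at processing time): the key selector maps each order to
        // the user id used as the GlobalKTable's key.
        orders.join(users,
                        (orderId, order) -> extractUserId(order),
                        (order, user) -> order + " | " + user)
              .to("orders-enriched");

        return builder;
    }

    // Hypothetical helper: assumes the order payload is a CSV string
    // whose first field is the user id.
    static String extractUserId(String order) {
        return order.split(",")[0];
    }
}
```

Note that the join's key selector can derive the lookup key from the order's value, which is exactly the "join on any field" flexibility a `GlobalKTable` gives over a partitioned `KTable`.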