java, maven, quarkus, quarkus-rest-client

Looking for the best way to share an interface between microservices with Quarkus


I'm still quite new to microservices and have a few basic architectural questions that I can't resolve on my own right now. I'm using the Quarkus framework with the standard extensions quarkus-resteasy and quarkus-rest-client.

The scenario:

I have a "Persistence" service, living in its own Maven project, that I want to populate with data from the outside via a REST call.

@Path("/api/persistence")
@Products(MediaType.APPLICATION_JSON)
public class Persistence{

    @Inject
    EntityManager entityManager;

    @POST
    @Transactional
    public Response create(PostDto postDto) {
        Post post = toPostMapper.toResource(postDto);
        entityManager.persist(post);
        return Response.ok(postDto).status(201).build();
    }
}

At the same time, I would like to have a DataGenerator microservice that generates the corresponding data and passes it to the Persistence service.

My problem: API sharing

Both services were created as separate Maven projects. According to the tutorials I found, the correct way would be to declare an interface (here called PersistenceApi) in the DataGenerator project, like this:

@Path("/api/persistence")
@Products(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApi {

    @POST
    @Transactional
    public Response create(PostDto post) ;
    
}

This interface is then injected into the DataGenerator service via @Inject, which leads to the following example service:

@RequestScoped
@Path("/api/datagenerator")
@Produces("application/json")
@Consumes("application/json")
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;

    @POST
    public void getPostExamplePostToPersistence() {
        PostDto post = new PostDto();
        post.setTitle("Find me in db in persistence-service");
        persistenceApi.create(post);
    }
}

I have the Persistence service running locally on port 8181 and have added the following entries to the application.properties of the DataGenerator project so that the service can be found:

furnace.collection.item.service.PersistenceApi/mp-rest/url=http://localhost:8181
furnace.collection.item.service.PersistenceApi/mp-rest/scope=javax.inject.Singleton

I find it "wrong" to declare the interface in my DataGenerator, because at this point I don't notice when the api provided by the Persistence service changes. Accordingly one could come up with the idea to position the interface in the Persistence service, which is then implemented by my concrete Persistence implementation and leads to the following code.

@Path("/api/persistence")
@Products(MediaType.APPLICATION_JSON)
@RegisterRestClient
public class PersistenceApiImpl implements PersistenceApi {

    @Inject
    EntityManager entityManager;

    @POST
    @Transactional
    public Response create(PostDto fruit) {
        Post post = toPostMapper.toResource(fruit);
        entityManager.persist(post);
        return Response.ok(fruit).status(201).build();
    }

}

In order to use the interface in my DataGenerator project, I would have to include the whole Persistence project as a dependency in my DataGenerator project, which sounds like a "monolith with extra steps" to me and therefore feels wrong in terms of separation of concerns.

I have tried the following approach: I created another Maven project called PersistenceApi which contains only the PersistenceApi interface. This project was then included as a dependency in both the Persistence and the DataGenerator project. In the Persistence project I implement the interface as in the example above, and in the DataGenerator project I try to inject the corresponding interface via @Inject.
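
Simplified, and with names shortened for illustration, the three-project setup looks roughly like this:

// In the shared "PersistenceApi" Maven project (dependency of both services)
@Path("/api/persistence")
@Produces(MediaType.APPLICATION_JSON)
@RegisterRestClient
public interface PersistenceApi {

    @POST
    Response create(PostDto post);
}

// In the "Persistence" project: the JAX-RS resource implementing the shared interface
public class PersistenceApiImpl implements PersistenceApi {
    // create(PostDto) implemented as shown above
}

// In the "DataGenerator" project: the rest client injected via the shared interface
public class DataGenerator {

    @Inject
    @RestClient
    PersistenceApi persistenceApi;
}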

Unfortunately this does not work. When I build the service, I get an UnsatisfiedResolutionException telling me that the PersistenceApi dependency, which I want to inject via @Inject in the DataGenerator service, cannot be satisfied.

Now my questions:

  1. I don't see what I'm missing here. Could you help me?
  2. Is this kind of API sharing with dedicated API projects a viable way, or is the "monolith with extra steps" approach really the way to go?

Thank you in advance.


Solution

  • That's a common problem with microservices. Like the book "Microservices: Grundlagen flexibler Softwarearchitekturen" by Eberhard Wolff (I saw that you are German too), I follow the idea that microservices should be coupled the same way as the teams developing them and the organization you are developing them for (have a look at Conway's law). Therefore, services of mostly independent teams should be developed independently, and an API change in one service should not affect another one at the time of the update.

    If you develop both services within your own team, then I think you can couple them the way you are doing it, because you don't have to coordinate with other teams and there will be no huge overhead. Note, however, that you will be forced to release both services together. If that is always OK for you, then save your time and do it your way; if not, have a look at API versioning:

    I use API versioning so that the old API is still reachable under "v1/" and the new one under "v2/". This way the team behind the other microservice has enough time to update their service.
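
    A minimal sketch of what that could look like for your persistence resource (just an illustration, class and DTO names are made up):

    // Old contract, still reachable for existing consumers
    @Path("/v1/api/persistence")
    @Produces(MediaType.APPLICATION_JSON)
    public class PersistenceResourceV1 {

        @POST
        public Response create(PostDto postDto) {
            // ... mapping and persisting omitted, same as in the existing create() ...
            return Response.ok(postDto).status(201).build();
        }
    }

    // New contract under v2/, e.g. with a changed DTO (PostDtoV2 is hypothetical)
    @Path("/v2/api/persistence")
    @Produces(MediaType.APPLICATION_JSON)
    public class PersistenceResourceV2 {

        @POST
        public Response create(PostDtoV2 postDto) {
            // ... mapping and persisting of the new DTO format omitted ...
            return Response.ok(postDto).status(201).build();
        }
    }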

    Have a look at Domain-Driven Design for the different ways of integrating bounded contexts (= services) and their coupling consequences. Without API versioning you are forced into a Partnership and you need to release together. Maybe you prefer Customer-Supplier or even Conformist.

    To test compatibility between the two services, have a look at consumer-driven contracts and Pact. You can also generate OpenAPI files and track their changes, but that will only help to notify people about changes.
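
    For example, with pact-jvm's JUnit 5 consumer support, a contract for the create call could roughly look like this (provider and consumer names are made up):

    // Annotations and types come from pact-jvm (au.com.dius.pact, consumer junit5 module)
    @ExtendWith(PactConsumerTestExt.class)
    @PactTestFor(providerName = "persistence-service")
    class PersistenceApiContractTest {

        // What the DataGenerator (consumer) expects from the Persistence service (provider)
        @Pact(provider = "persistence-service", consumer = "data-generator")
        public RequestResponsePact createPost(PactDslWithProvider builder) {
            return builder
                    .uponReceiving("a request to create a post")
                    .path("/api/persistence")
                    .method("POST")
                    .willRespondWith()
                    .status(201)
                    .toPact();
        }

        @Test
        void createReturns201(MockServer mockServer) throws Exception {
            // Call the Pact mock server the same way the DataGenerator would call the real service
            HttpResponse<String> response = HttpClient.newHttpClient().send(
                    HttpRequest.newBuilder()
                            .uri(URI.create(mockServer.getUrl() + "/api/persistence"))
                            .POST(HttpRequest.BodyPublishers.noBody())
                            .build(),
                    HttpResponse.BodyHandlers.ofString());
            assertEquals(201, response.statusCode());
            // The generated pact file can then be verified against the real Persistence service
        }
    }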