I am deploying the Spring Cloud Data Flow server using Docker. I have created a data processing pipeline inside the Data Flow server by deploying a couple of Spring Boot applications as source, processor, and sink. To access the log of each service, I currently have to either tail it from inside the Docker container (bash) or copy it from the container to the local disk.
I want to push these logs to Kafka using the Log4j Kafka appender for later analysis. I am already doing this for other services running outside Spring Cloud Data Flow. Is there a way to manage the logs of services running inside Spring Cloud Data Flow using Log4j?
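For reference, this is roughly the kind of Log4j 2 configuration I use for the services outside Data Flow (the topic name and broker address are placeholders for my environment):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
  <Appenders>
    <!-- Ships each log event to a Kafka topic -->
    <Kafka name="KafkaAppender" topic="app-logs">
      <PatternLayout pattern="%d{ISO8601} %-5level [%t] %logger{36} - %msg%n"/>
      <Property name="bootstrap.servers">localhost:9092</Property>
    </Kafka>
    <Console name="Console" target="SYSTEM_OUT">
      <PatternLayout pattern="%d{ISO8601} %-5level %logger{36} - %msg%n"/>
    </Console>
  </Appenders>
  <Loggers>
    <!-- Keep Kafka's own client logging off the Kafka appender to avoid recursion -->
    <Logger name="org.apache.kafka" level="WARN"/>
    <Root level="INFO">
      <AppenderRef ref="KafkaAppender"/>
      <AppenderRef ref="Console"/>
    </Root>
  </Loggers>
</Configuration>
```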
Spring Cloud Stream and Spring Cloud Task apps are standalone Spring Boot applications. This SO thread has some insights into the addition of relevant libraries to consistently publish logs from Spring Boot applications to Kafka.
If you want to make the same addition to the out-of-the-box (OOTB) apps, too, please check out the patching procedure described in the reference guide.
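When patching, the change typically amounts to adding the Log4j 2 dependencies to the patched app's build. A rough sketch of the Maven fragment (the exact artifacts and versions are assumptions on my part, not something SCDF prescribes):

```xml
<dependencies>
  <!-- Swap Spring Boot's default Logback setup for Log4j 2;
       remember to exclude spring-boot-starter-logging from the
       starters that pull it in transitively -->
  <dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
  </dependency>
  <!-- Kafka client library required by the Log4j 2 Kafka appender -->
  <dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
  </dependency>
</dependencies>
```

With that in place, the same `log4j2.xml` with a Kafka appender that you use for your standalone services should apply unchanged to the apps deployed through Data Flow.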