Tags: spring-batch, cloud-foundry, spring-cloud-dataflow

Spring Cloud Data Flow Task properties are not refreshed between executions


I'm running Spring Cloud Data Flow on Pivotal Cloud Foundry, using the Cloud Foundry server jar file. I have a task application which is a Spring Batch job, and I'm using @ConfigurationProperties to configure parameters for the batch job.
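
For context, the parameters are bound roughly like this (TruncateProperties, the field types, and the getters/setters are my own illustrative naming; the truncate prefix is inferred from the property keys shown below):

import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

@Component
@ConfigurationProperties(prefix = "truncate")
public class TruncateProperties {

    private Long locationId;     // bound from truncate.location-id
    private String weekEndDate;  // bound from truncate.week-end-date

    public Long getLocationId() { return locationId; }
    public void setLocationId(Long locationId) { this.locationId = locationId; }

    public String getWeekEndDate() { return weekEndDate; }
    public void setWeekEndDate(String weekEndDate) { this.weekEndDate = weekEndDate; }
}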

The first time I launch the task, I pass in properties such as:

task launch --name job-truncate --properties "app.job-truncate.truncate.location-id=1, app.job-truncate.truncate.week-end-date=2018-02-01"

The batch job completes successfully.

If I launch the same task again, but with a different property, such as:

task launch --name job-truncate --properties "app.job-truncate.truncate.location-id=2, app.job-truncate.truncate.week-end-date=2018-01-31"

The job executes, but the app keeps the properties from the previous run. I can inspect the Cloud Foundry console and verify that the app's user-provided environment variables are unchanged. I did some debugging on the server app, and it looks like the server finds the existing app with the same name and then issues a REST API call to start that same app, disregarding any changed properties.

Is that the intended behavior? If so, what would be the best way to make sure the properties are refreshed on each run?


Solution

  • For tasks run on Cloud Foundry, we do not modify the properties between executions. The idea is that those are set as environment variables on the droplet and don't change. We do, however, modify the command-line arguments between executions, so if you want to override the properties, pass them as command-line arguments instead, as in the example below.
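
For example, assuming the truncate prefix from the sketch above, the overrides could be passed via the shell's --arguments option; Spring Boot's relaxed binding maps command-line arguments onto the same @ConfigurationProperties fields:

task launch --name job-truncate --arguments "--truncate.location-id=2 --truncate.week-end-date=2018-01-31"

Because arguments are supplied fresh on each launch, each run picks up the new values even though the droplet's environment variables stay the same.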