Tags: java, spring, oracle-database, spring-batch, spring-cloud-dataflow

How to solve the exception "java.lang.IllegalArgumentException: Invalid TaskExecution, ID 3" when launching a task from SCDF?


I'm trying to run a Spring Batch jar through SCDF. I use different datasources for reading and writing (both Oracle DB); the datasource I write to is the primary datasource. I use a custom build of SCDF to include the Oracle driver dependencies. Below is the custom SCDF project location.

dataflow-server-22x
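
The custom server essentially adds the Oracle JDBC driver to the dataflow server build; roughly something like the following in the server's pom.xml (the ojdbc coordinates here are one plausible choice, not necessarily what the linked project uses):

    <!-- Oracle JDBC driver; coordinates are an assumption -->
    <dependency>
        <groupId>com.oracle.ojdbc</groupId>
        <artifactId>ojdbc8</artifactId>
        <version>19.3.0.0</version>
    </dependency>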

In my local Spring Batch project I extended DefaultTaskConfigurer to provide my primary datasource. When I run the batch project from the IDE it runs fine: it reads records from the secondary datasource and writes them into the primary datasource. But when I deploy the batch jar to the custom-built SCDF as a task and launch it, I get an error that says,

org.springframework.context.ApplicationContextException: Failed to start bean 'taskLifecycleListener'; nested exception is java.lang.IllegalArgumentException: Invalid TaskExecution, ID 3 not found

When I checked the task execution table (which can be accessed via the primary datasource), the task execution ID is there in the table, but I still get this error. For each run a new task ID is inserted into the TASK_EXECUTION table, yet I get the above error message with the newly inserted task execution ID. Below are the project specifics:

Spring-boot-starter-parent : 2.2.5.RELEASE.
Spring-cloud-dataflow : 2.2.0.RELEASE.
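
For context, the task was registered and launched on the custom server roughly like this via the SCDF shell (the app name and Maven URI are placeholders, not the actual ones):

    dataflow:> app register --name job1-batch --type task --uri maven://com.example:job1-batch:0.0.1-SNAPSHOT
    dataflow:> task create job1-task --definition "job1-batch"
    dataflow:> task launch job1-task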

I load all of my batch jobs from the main class of the Boot app using an instance of each batch job class, and only the main class (which kickstarts all the jobs) contains the @EnableTask annotation (this is what registers the 'taskLifecycleListener' bean named in the error). Below is my class structure.

    @SpringBootApplication
    @EnableScheduling
    @EnableTask
    public class SpringBootMainApplication {

        @Autowired
        Job1Loader job1Loader;

        @Autowired
        JobLauncher jobLauncher; // needed by executeJob1Loader()

        public static void main(String[] args) {
            SpringApplication.run(SpringBootMainApplication.class, args);
        }

        @Scheduled(cron = "0 */1 * * * ?")
        public void executeJob1Loader() throws Exception {
            JobParameters param = new JobParametersBuilder()
                    .addString("JobID", String.valueOf(System.currentTimeMillis()))
                    .toJobParameters();
            jobLauncher.run(job1Loader.loadJob1(), param);
        }
    }

    //Job Config
    @Configuration
    @EnableBatchProcessing
    public class Job1Loader {

        @Autowired
        JobBuilderFactory jobBuilderFactory; // provided by @EnableBatchProcessing

        @Bean
        public Job loadJob1() {
            return jobBuilderFactory.get("Job1Loader")
                    .incrementer(new RunIdIncrementer())
                    .flow(step01()) // step01() is not shown here; a sketch follows below
                    .end()
                    .build(); // return job
        }
    }
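
The question doesn't show step01(); here is a minimal sketch of what it could look like for the read-from-secondary / write-to-primary flow described earlier. The @Qualifier names, SQL, table names, and chunk size are all assumptions, not taken from the original project:

    @Autowired
    @Qualifier("datasource1") // primary (write) datasource -- qualifier name assumed
    DataSource primaryDataSource;

    @Autowired
    @Qualifier("datasource2") // secondary (read) datasource -- qualifier name assumed
    DataSource secondaryDataSource;

    @Autowired
    StepBuilderFactory stepBuilderFactory; // provided by @EnableBatchProcessing

    @Bean
    public Step step01() {
        return stepBuilderFactory.get("step01")
                // read rows as column maps from the secondary DB, write them to the primary DB
                .<Map<String, Object>, Map<String, Object>>chunk(100)
                .reader(new JdbcCursorItemReaderBuilder<Map<String, Object>>()
                        .name("job1Reader")
                        .dataSource(secondaryDataSource)
                        .sql("SELECT ID, NAME FROM SOURCE_TABLE")
                        .rowMapper(new ColumnMapRowMapper())
                        .build())
                .writer(new JdbcBatchItemWriterBuilder<Map<String, Object>>()
                        .dataSource(primaryDataSource)
                        .sql("INSERT INTO TARGET_TABLE (ID, NAME) VALUES (:ID, :NAME)")
                        .itemSqlParameterSourceProvider(MapSqlParameterSource::new)
                        .build())
                .build();
    }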

I use two different datasources in my Spring Batch project, both Oracle datasources (on different servers). I marked one of them as primary and used that datasource in my custom implementation of DefaultTaskConfigurer, as below.

    @Configuration
    public class TaskConfig extends DefaultTaskConfigurer {

        @Autowired
        DatabaseConfig databaseConfig;

        @Override
        public DataSource getTaskDataSource() {
            return databaseConfig.dataSource(); // dataSource() returns the primary ds
        }
    }

Below are the properties I use in both the SCDF custom server and the Spring Batch project.

UPDATE - 1

**Spring batch Job:**

    # Primary (write) datasource
    spring.datasource.jdbc-url=jdbc:oracle:thin:@**MY_PRIMARY_DB**
    spring.datasource.username=db_user
    spring.datasource.password=db_pwd
    spring.datasource.driver-class-name=oracle.jdbc.OracleDriver

    # Secondary (read) datasource
    spring.datasource.jdbc-url=jdbc:oracle:thin:@**MY_SECONDARY_DB**
    spring.datasource.username=db_user
    spring.datasource.password=db_pwd
    spring.datasource.driver-class-name=oracle.jdbc.OracleDriver

**SCDF custom Server:**

    spring.datasource.url=jdbc:oracle:thin:@**MY_PRIMARY_DB**
    spring.datasource.username=db_user
    spring.datasource.password=db_pwd
    spring.datasource.driver-class-name=oracle.jdbc.OracleDriver

My batch application uses two DB configurations, one to read and one to write, because the source and destination are different. Since the TASK_EXECUTION tables were created in the MY_PRIMARY_DB database, I pass only the primary DB configuration to SCDF for its reads and writes, because they both take place in the same DB.
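
The DatabaseConfig class referenced in TaskConfig isn't shown in the question; here is a minimal sketch of what it might look like. The bean name "datasource1" appears in the solution below; the "datasource2" name and the spring.secondary.datasource prefix are assumptions:

    @Configuration
    public class DatabaseConfig {

        // Primary (write) datasource -- also hosts the TASK_EXECUTION tables.
        @Primary
        @Bean("datasource1")
        @ConfigurationProperties(prefix = "spring.datasource")
        public DataSource dataSource() {
            return DataSourceBuilder.create().build();
        }

        // Secondary (read) datasource; in practice the two datasources need
        // distinct property prefixes rather than the duplicated
        // spring.datasource keys shown above.
        @Bean("datasource2")
        @ConfigurationProperties(prefix = "spring.secondary.datasource")
        public DataSource secondaryDataSource() {
            return DataSourceBuilder.create().build();
        }
    }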

I tried the other answers to this question, but none worked. As I said earlier, any input on this would be of great help. Thanks.


Solution

  • Instead of overriding the DefaultTaskConfigurer.getTaskDataSource() method as I did above, I changed the DefaultTaskConfigurer implementation as below. I'm not sure yet why overriding getTaskDataSource() causes the problem. Below is the solution that worked for me.

    @Configuration
    public class TaskConfig extends DefaultTaskConfigurer {

        Logger logger = LoggerFactory.getLogger(TaskConfig.class);

        @Autowired
        public TaskConfig(@Qualifier("datasource1") DataSource dataSource) {
            super(dataSource); // "datasource1" refers to the primary datasource
        }
    }
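
A plausible explanation, from reading DefaultTaskConfigurer's source: the class builds its TaskRepository and TaskExplorer in its constructor, so with the no-arg constructor they are wired against an in-memory map-based DAO before the getTaskDataSource() override is ever consulted. The launched task then can't see the TASK_EXECUTION row SCDF inserted in Oracle, which is exactly the "Invalid TaskExecution, ID 3 not found" failure. Passing the primary datasource to the super constructor makes the task repository read the same table SCDF writes to.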