The Problem: a Spring Batch job fails to execute when launched through CommandLineJobRunner, where the application defines its own data source and Hibernate configuration.
Error message (extracted):

DatabaseLookup org.springframework.boot.autoconfigure.orm.jpa.DatabaseLookup getDatabase
org.springframework.jdbc.support.MetaDataAccessException: Could not get Connection for extracting meta-data;
  nested exception is org.springframework.jdbc.CannotGetJdbcConnectionException: Failed to obtain JDBC Connection;
  nested exception is org.apache.commons.dbcp.SQLNestedException: Cannot create PoolableConnectionFactory
...
Caused by: org.hibernate.HibernateException: Access to DialectResolutionInfo cannot be null when 'hibernate.dialect' not set
A bit about the batch job:
- SCDF is run using the docker-compose.yml downloaded from the Spring web site.
- A number of properties files under /config are built into the jar, including a Hibernate configuration file defining "hibernate.dialect=org.hibernate.dialect.MySQLDialect".
- The application defines its own data source using the properties below (a sketch of how such a data source might be wired follows the properties).
qre.data.driverClassName=org.mariadb.jdbc.Driver
qre.data.url=jdbc:mysql://127.0.0.1:3306/dataflow
qre.data.username=root
qre.data.password=rootpw
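
Since the stack trace mentions commons-dbcp, the custom data source is presumably wired along these lines. This is only a sketch; the class name, bean name, and use of @Value are assumptions, not the actual code:

import javax.sql.DataSource;

import org.apache.commons.dbcp.BasicDataSource;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

// Hypothetical configuration class; the real application's wiring may differ.
@Configuration
public class QreDataSourceConfig {

    @Value("${qre.data.driverClassName}")
    private String driverClassName;

    @Value("${qre.data.url}")
    private String url;

    @Value("${qre.data.username}")
    private String username;

    @Value("${qre.data.password}")
    private String password;

    @Bean
    public DataSource qreDataSource() {
        // commons-dbcp pool, consistent with the SQLNestedException in the stack trace
        BasicDataSource dataSource = new BasicDataSource();
        dataSource.setDriverClassName(driverClassName);
        dataSource.setUrl(url);
        dataSource.setUsername(username);
        dataSource.setPassword(password);
        return dataSource;
    }
}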
The jar file is built using the spring-boot-maven-plugin, with a.b.c.MyCommandLineJobRunner (which extends org.springframework.batch.core.launch.support.CommandLineJobRunner) as the mainClass:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <mainClass>a.b.c.MyCommandLineJobRunner</mainClass>
    </configuration>
</plugin>
MyCommandLineJobRunner extends Spring's CommandLineJobRunner and passes the job name and configuration as name/value pairs, e.g.:
job.name=MYJOB
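
As a rough illustration of what such a wrapper might look like: this is a sketch, and the configuration class name a.b.c.BatchConfiguration and the argument handling are assumptions rather than the poster's actual implementation.

package a.b.c;

import java.util.Arrays;

import org.springframework.batch.core.launch.support.CommandLineJobRunner;

// Hypothetical wrapper; the real MyCommandLineJobRunner may differ.
public class MyCommandLineJobRunner extends CommandLineJobRunner {

    public static void main(String[] args) throws Exception {
        // Pick the job name out of the name/value pairs, e.g. job.name=MYJOB.
        String jobName = Arrays.stream(args)
                .filter(arg -> arg.startsWith("job.name="))
                .map(arg -> arg.substring("job.name=".length()))
                .findFirst()
                .orElseThrow(() -> new IllegalArgumentException("job.name=<name> is required"));

        // Delegate to the stock runner: the first argument is the job configuration,
        // the second is the job name; further name/value pairs become job parameters.
        CommandLineJobRunner.main(new String[] { "a.b.c.BatchConfiguration", jobName });
    }
}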
The jar runs successfully locally with "java -jar application.jar job.name=MYJOB".
I tried searching the SCDF reference guide but have not found anything useful yet. Any help is appreciated.
I am not sure why your application tries to override the Hibernate dialect property, as the batch application still needs to use SCDF's data source. You can override the Hibernate dialect for the SCDF server using the property spring.jpa.properties.hibernate.dialect. You can see some examples of this in the documentation.
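
For example, set on the SCDF server (in its configuration or passed when starting it), the property would look like this; the dialect value below simply mirrors the one from the question and may need to match your MySQL version:

# Set on the SCDF server, not on the batch application.
spring.jpa.properties.hibernate.dialect=org.hibernate.dialect.MySQLDialect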