Whenever I add a module-info.java to my multi-module project, I cannot import my Spark dependencies; everything else seems to work fine.
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.12</artifactId>
    <version>3.0.0-preview2</version>
</dependency>
IntelliJ tries to re-add the Maven dependency, without any result.
My module-info.java looks like this:
module common {
    exports [...]
    requires lombok;
    requires spring.data.jpa;
    requires spring.data.commons;
    requires org.apache.commons.lang3;
    requires spring.context;
    requires spring.web;
    requires spring.security.core;
    requires com.google.common;
    requires org.json;
    requires spring.core;
    requires spring.beans;
    requires com.fasterxml.jackson.core;
    requires com.fasterxml.jackson.databind;
    requires spring.jcl;
    requires spring.webmvc;
    requires mongo.java.driver;
    requires org.hibernate.orm.core;
    requires com.fasterxml.jackson.dataformat.csv;
    requires java.sql;
}
It is not possible to add any org.apache.* module to my module-info.java either.
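For example, a directive along the lines of the sketch below (spark.core is only a guess at what an automatic module name might look like) does not resolve:

module common {
    // hypothetical: the Spark jars do not expose any resolvable module name,
    // so no variant of this directive can be satisfied on the module path
    requires spark.core;   // javac rejects this: module not found
}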
Is it possible that Spark is not ready for Jigsaw modules and Java 9+?
Is it possible that Spark is not ready for Jigsaw modules and Java 9+?

That does hold true for Spark. Two concrete reasons that I can vouch for are:
They do not have an Automatic-Module-Name: <module-name> entry in the artifact's MANIFEST.MF file.
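You can check this yourself with a minimal sketch (the jar path is an assumption; point it at the copy in your local Maven repository) that prints the Automatic-Module-Name manifest entry, which comes back empty for the Spark jars:

import java.util.jar.JarFile;

// Sketch: print the Automatic-Module-Name manifest entry of a jar, if present.
// For the Spark artifacts this prints "null", i.e. no such entry exists.
public class CheckAutomaticModuleName {
    public static void main(String[] args) throws Exception {
        try (JarFile jar = new JarFile("spark-core_2.12-3.0.0-preview2.jar")) {
            System.out.println(jar.getManifest()
                                  .getMainAttributes()
                                  .getValue("Automatic-Module-Name"));
        }
    }
}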
If you try describing their artifacts using the jar tool:

jar --describe-module --file=<complete-path>/spark-core_2.12-3.0.0-preview2.jar
This would fail to derive the module descriptor, for a reason similar to the one mentioned in this answer: the automatic module name is derived from the jar file name, which here reduces to spark.core.2.12, and segments such as 2 and 12 are not valid Java identifiers.
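The same failure can be reproduced from code; a minimal sketch (again, the jar path is an assumption) using ModuleFinder, which throws a FindException because no module descriptor can be derived from the file name:

import java.lang.module.ModuleFinder;
import java.nio.file.Path;

// Sketch: scanning the Spark jar with the module system fails the same way
// `jar --describe-module` does, by throwing java.lang.module.FindException.
public class DescribeSparkJar {
    public static void main(String[] args) {
        ModuleFinder finder = ModuleFinder.of(Path.of("spark-core_2.12-3.0.0-preview2.jar"));
        finder.findAll().forEach(ref -> System.out.println(ref.descriptor()));
    }
}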
A few resources that might be useful once you reach this point: