How to create a fat jar with specific dependencies.
I have a Spark project that needs 2 external jars which I want to include in the application jar. When I build a plain executable jar, no dependencies are included, and when I build a fat jar, all of the dependencies get added, including Spark itself. I want to add only those 2 jars to my jar. Below is the pom file I created using the maven-assembly-plugin.
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
    </dependency>
    <!-- Below dependencies need to be added in the application jar -->
    <dependency>
        <groupId>netacuity</groupId>
        <artifactId>common-netacuity-db</artifactId>
        <version>3.1.2</version>
    </dependency>
    <dependency>
        <groupId>netacuity</groupId>
        <artifactId>common</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-assembly-plugin</artifactId>
            <version>2.4.1</version>
            <configuration>
                <!-- get all project dependencies -->
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
                <!-- mainClass in the manifest makes an executable jar -->
                <archive>
                    <manifest>
                        <mainClass>com....App</mainClass>
                    </manifest>
                </archive>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <!-- bind to the packaging phase -->
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
You can use the dependency scope for this. By default the scope is compile, so all of the jars get included when you package the project.
To include a jar in the assembly, you can either set its scope to compile explicitly or simply keep the default:
<dependency>
<groupId>netacuity</groupId>
<artifactId>common-netacuity-db</artifactId>
<version>3.1.2</version>
<scope>compile</scope>
</dependency>
To exclude a jar from the assembly, change its scope to provided. These jars must then be available at runtime; for the Spark dependencies, the cluster supplies them when you run your application with spark-submit.
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>2.2.0</version>
<scope>provided</scope>
</dependency>
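Putting it together, here is a sketch of the dependencies section for your case (same artifacts and versions as in your pom, only the scopes changed): the Spark artifacts are marked provided so they are left out of the jar-with-dependencies assembly, while the two netacuity artifacts keep the default compile scope so they are bundled.
<dependencies>
    <!-- provided: supplied by the Spark runtime, excluded from the fat jar -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.2.0</version>
        <scope>provided</scope>
    </dependency>
    <!-- compile (the default): bundled into the jar-with-dependencies assembly -->
    <dependency>
        <groupId>netacuity</groupId>
        <artifactId>common-netacuity-db</artifactId>
        <version>3.1.2</version>
    </dependency>
    <dependency>
        <groupId>netacuity</groupId>
        <artifactId>common</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
After mvn package, the *-jar-with-dependencies.jar produced by the assembly plugin should contain only your own classes plus the two netacuity jars, and you submit it as usual with spark-submit, which puts the Spark jars on the classpath for you.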