Tags: scala, junit, sbt, sbt-assembly

sbt assembly with JUnit tests failing


I am very new to Scala and sbt.
I want to run JUnit tests as part of sbt assembly.
I wrote all my tests, and they all run correctly in IntelliJ. When I try to build with the tests, however, the build always fails with lots of errors.

Here is my build.sbt

name := "updater"
version := "0.1-SNAPSHOT"
scalaVersion := "2.11.12"

val sparkVersion = "2.4.0"

    libraryDependencies ++= Seq(

          //"org.scala-lang" % "scala-reflect" % "2.11.12",
          "org.apache.spark" %% "spark-core" % sparkVersion % Provided,
          "org.apache.spark" %% "spark-sql" % sparkVersion % Provided,
          "com.typesafe" % "config" % "1.3.4",


          //Testing
          "junit" % "junit" % "4.10" % Test,
          "com.novocode" % "junit-interface" % "0.11" % Test
          //  exclude("junit", "junit-dep")
          ,
          //"org.scalatest" %% "scalatest" % "3.0.7" % Test,
          "org.easymock" % "easymock" % "4.0.2" % Test,


          //Logging
          "ch.qos.logback" % "logback-classic" % "1.2.3",
          "com.typesafe.scala-logging" %% "scala-logging" % "3.9.0"
        )

        assemblyMergeStrategy in assembly := {
          case PathList("src/test/resources/library.properties", xs@_*) => MergeStrategy.discard
          case PathList("META-INF", xs@_*) => MergeStrategy.discard
          case x => MergeStrategy.first
        } 
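
If it matters, my understanding is that sbt-assembly runs the tests as part of the assembly task by default, which is presumably why the build only fails here and not in IntelliJ. For completeness, it can also be told to skip them entirely (a sketch in the same pre-1.x `in` syntax as above), though I actually want the tests to pass:

    // Skip the test run that the assembly task triggers by default;
    // tests can still be executed on their own with `sbt test`
    test in assembly := {}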

I am attaching the log file, since the problem, to me as a newbie, seems incomprehensible. It is driving me crazy.

Below (in the solution) is my abstract test class, which is supposed to initialize a Spark context via @BeforeClass for every test class. I only mention it because I suspect it could be the cause of the failure.
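
For reference, a concrete test class extends it roughly like this (a minimal sketch; the suite name and the toy assertion are mine, not from the project):

    import org.junit.Test
    import org.junit.Assert.assertEquals

    // Hypothetical suite built on the abstract SparkTest base class
    class TableCountTest extends SparkTest {

      @Test
      def countsAnInMemoryDataset(): Unit = {
        import spark.implicits._
        val df = Seq(1, 2, 3).toDF("value")
        assertEquals(3L, df.count())
      }
    }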

Do you have any suggestions on how to solve it?
Thanks


Solution

  • I was instantiating the class like so:

        import org.apache.spark.sql.SparkSession
        import org.junit.{AfterClass, BeforeClass}

        abstract class SparkTest {
          // Each concrete test class picks up the shared session from the companion object
          val spark: SparkSession = SparkTest.spark
        }

        object SparkTest {
          var spark: SparkSession = _

          @BeforeClass
          def initializeSpark(): Unit = {
            spark = SparkSession
              .builder()
              .appName("TableUpdaterTest")
              .master("local")
              .getOrCreate()
          }

          @AfterClass
          def stopSpark(): Unit = {
            spark.stop()
          }
        }
    

    Apparently, by commenting out the spark.stop() call, everything started to work.
    Does anyone have an idea why?
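
    I have not verified this, but one plausible explanation: unlike IntelliJ, sbt by default runs all test classes in a single JVM, so the @AfterClass of the first suite to finish stops the very SparkSession that getOrCreate then hands to every later suite. A defensive sketch that avoids a per-suite stop (the SharedSpark name is mine, not from the project):

        import org.apache.spark.sql.SparkSession

        // Sketch under the single-JVM assumption: create the session lazily once
        // and tear it down only at JVM exit, so no suite can stop it out from
        // under the suites that run after it.
        object SharedSpark {
          lazy val spark: SparkSession = {
            val session = SparkSession
              .builder()
              .appName("TableUpdaterTest")
              .master("local")
              .getOrCreate()
            sys.addShutdownHook(session.stop())
            session
          }
        }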