Tags: apache-spark, sbt, sbt-assembly

sbt switch dependencies for runtime


I am developing a Spark application that uses xgboost4j: https://github.com/dmlc/xgboost/tree/master/jvm-packages

This package has to be compiled for the local architecture because the jar contains native C dependencies, but the cluster has a different architecture than my development laptop. How can I substitute the package with one built for the cluster when running sbt assembly? Or would you suggest solving this via a % "provided" dependency?


Solution

  • Use a suffix to switch such libraries between provided and compile scope, like this:

    // resolve to "provided" when the JVM property -Dprovided is set, otherwise "compile"
    val suffix =
      if (Option(System.getProperty("provided")).isDefined) "provided" else "compile"
    
    libraryDependencies += "org.apache.spark" %% "spark-sql" % Spark.version % suffix
    

    Then run plain sbt assembly if you need all jars in your uberjar, and sbt -Dprovided assembly to exclude the dependencies the cluster already supplies. A full sketch applying this to xgboost4j follows below.
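
For completeness, here is how the same switch could look for the xgboost4j dependency from the question. This is a minimal build.sbt sketch, not the package's official setup: the ml.dmlc coordinates and both version numbers are assumptions that should be checked against Maven Central and the versions installed on your cluster.

    // build.sbt (sketch): mark cluster-supplied libraries as "provided" so the
    // cluster's own builds (compiled for its architecture) are used at runtime.
    val sparkVersion   = "3.3.2"   // assumption: match your cluster's Spark version
    val xgboostVersion = "1.6.1"   // assumption: match the cluster's xgboost4j build
    
    // "provided" when -Dprovided is set; equivalent to the Option check above
    val suffix = if (sys.props.contains("provided")) "provided" else "compile"
    
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-sql"       % sparkVersion   % suffix,
      "ml.dmlc"          %% "xgboost4j-spark" % xgboostVersion % suffix
    )

With this in place, sbt assembly builds a self-contained uberjar for local testing, while sbt -Dprovided assembly leaves the Spark and xgboost4j jars out of the assembly, so the copies installed on the cluster (built for its architecture) are picked up at runtime instead of the locally compiled ones.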