How to cross compile for Spark 2 and Spark 3 with sbt?


In my project, I need to develop for both Spark 2 and Spark 3 simultaneously, so I need to compile twice. My current approach is a shell script, like:

./gradlew package -Pscala=2.11.8 -Pspark=2.3.2
./gradlew package -Pscala=2.12.8 -Pspark=3.3.1 

I want to migrate to SBT. Can this be done through cross compilation?


Solution

  • SBT's cross compilation is primarily meant for cross compiling against multiple Scala versions.

    If each Spark version you want to compile against maps to exactly one Scala version, you could use SBT cross compilation for that as a kind of workaround, as in the sketch below.
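
    For example, a minimal build.sbt sketch, assuming the Scala/Spark pairings from the question (the spark-sql artifact is illustrative):

        // Default Scala version plus the cross-build list.
        scalaVersion := "2.12.8"
        crossScalaVersions := Seq("2.11.8", "2.12.8")

        // Derive the Spark version from the Scala version being compiled.
        libraryDependencies += {
          CrossVersion.partialVersion(scalaVersion.value) match {
            case Some((2, 11)) => "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"
            case _             => "org.apache.spark" %% "spark-sql" % "3.3.1" % "provided"
          }
        }

    Running sbt +package then builds one artifact per Scala version, and therefore per Spark version.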


    A better solution is probably to use SBT's "multi module" feature and have a module for each Spark version (optionally cross compiled to multiple Scala versions), sharing all (or some) of the source code, with different library versions where needed. See the sketch after this paragraph.
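
    A sketch of that layout, assuming illustrative module names (spark2, spark3) and a shared/ directory for the common sources:

        // build.sbt — one module per Spark version, both compiling the shared sources.
        lazy val sharedSources = Seq(
          Compile / unmanagedSourceDirectories +=
            (ThisBuild / baseDirectory).value / "shared" / "src" / "main" / "scala"
        )

        lazy val spark2 = (project in file("spark2"))
          .settings(
            sharedSources,
            scalaVersion := "2.11.8",
            libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"
          )

        lazy val spark3 = (project in file("spark3"))
          .settings(
            sharedSources,
            scalaVersion := "2.12.8",
            libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.1" % "provided"
          )

    A single sbt spark2/package spark3/package invocation then produces both jars, replacing the two Gradle runs.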

    There are many ways to implement this; if you go this way and run into issues, post a specific question here and we'll help.

    There's also the sbt-projectmatrix plugin, which can simplify the submodule setup for you and also handles cross compilation, though differently from SBT's default mechanism.
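
    A sketch with that plugin, assuming a recent plugin version (check for the latest) and the same illustrative Spark artifact:

        // project/plugins.sbt
        addSbtPlugin("com.eed3si9n" % "sbt-projectmatrix" % "0.9.0")

        // build.sbt — one matrix project, expanded by the plugin into
        // one concrete subproject per Scala version.
        lazy val core = (projectMatrix in file("core"))
          .settings(
            libraryDependencies += {
              CrossVersion.partialVersion(scalaVersion.value) match {
                case Some((2, 11)) => "org.apache.spark" %% "spark-sql" % "2.3.2" % "provided"
                case _             => "org.apache.spark" %% "spark-sql" % "3.3.1" % "provided"
              }
            }
          )
          .jvmPlatform(scalaVersions = Seq("2.11.8", "2.12.8"))

    Unlike the +package approach, the generated subprojects can be built in parallel within a single sbt invocation.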