
Pass parameters to the jar when using SparkLauncher


I am trying to create an executable jar which uses SparkLauncher to run another jar containing a data transformation task (that jar creates its own Spark session).

I need to pass Java parameters (some Java arrays) to the jar that the launcher executes.

import org.apache.spark.launcher.SparkLauncher

object launcher {
  @throws[Exception]
  def main(args: Array[String]): Unit = {
    // How do I pass parameters to spark_job_with_spark_session.jar?
    val handle = new SparkLauncher()
      .setAppResource("spark_job_with_spark_session.jar")
      .setVerbose(true)
      .setMaster("local[*]")
      .setConf(SparkLauncher.DRIVER_MEMORY, "4g")
      .launch()
  }
}

How can I do that?


Solution

  • need to pass java parameters(some java arrays)

    Launching with SparkLauncher is equivalent to executing spark-submit, so you cannot pass Java objects (such as arrays) directly. Use app args:

    addAppArgs(String... args)
    

    This passes string application arguments to the launched jar's main method, where you parse them back into the values you need.
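
    A minimal sketch of this approach, with both sides of the hand-off. The comma delimiter, the example values, and the `SparkJob` object name are illustrative assumptions, not part of the original post: the array is serialized to a string in the launcher, passed via addAppArgs, and split back apart in the launched jar.

    ```scala
    import org.apache.spark.launcher.SparkLauncher

    object launcher {
      def main(args: Array[String]): Unit = {
        // Serialize the array as a comma-separated string (illustrative choice).
        val ids = Array(1, 2, 3).mkString(",")

        val process = new SparkLauncher()
          .setAppResource("spark_job_with_spark_session.jar")
          .setVerbose(true)
          .setMaster("local[*]")
          .setConf(SparkLauncher.DRIVER_MEMORY, "4g")
          .addAppArgs(ids)   // forwarded to the launched jar's main(args)
          .launch()

        process.waitFor()    // launch() returns a java.lang.Process
      }
    }

    // Inside spark_job_with_spark_session.jar (hypothetical main class):
    object SparkJob {
      def main(args: Array[String]): Unit = {
        // Recover the array from the single string argument.
        val ids: Array[Int] = args(0).split(",").map(_.toInt)
        // ... create the SparkSession and run the transformation with `ids`
      }
    }
    ```

    Anything that can be round-tripped through a string works the same way; for structured data, a JSON-encoded argument is a common choice.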