Tags: apache-spark, hadoop2

What is difference between spark.jars and spark.driver.extraClassPath


I'm trying to run a Spark program with spark-submit in yarn-client mode and I'm getting a ClassNotFoundException. So my question is: which option should I use to pass my jar, --jars or --driver-class-path?

Spark = 2.0.0, HDP = 2.5, Hadoop = 2.7.3


Solution

  • Use --jars if you want to make the jars available on both the driver and executor classpaths. If the jar is needed only by driver code, use --driver-class-path instead (see the example below).
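
For illustration, a minimal sketch of both invocations in yarn-client mode. The jar path /path/to/mylib.jar, the application jar my-app.jar, and the main class com.example.Main are placeholders, not taken from the question:

    # Jar needed on both the driver and the executors: ship it with --jars
    spark-submit \
      --master yarn \
      --deploy-mode client \
      --jars /path/to/mylib.jar \
      --class com.example.Main \
      my-app.jar

    # Jar needed only by driver code: put it on the driver classpath
    spark-submit \
      --master yarn \
      --deploy-mode client \
      --driver-class-path /path/to/mylib.jar \
      --class com.example.Main \
      my-app.jar

These command-line options map to the configuration properties in the question title: --jars corresponds to spark.jars, and --driver-class-path corresponds to spark.driver.extraClassPath (which can also be set with --conf spark.driver.extraClassPath=/path/to/mylib.jar).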