I built the Spark 2.1 source code successfully. However, when I run some of the examples (e.g., org.apache.spark.examples.mllib.BinaryClassification), I get the following error:

Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser

I also tried running those examples with the Spark 2.1 pre-built distribution (examples/jars/spark-examples_2.11-2.1.0.jar) and got the same error. The Spark 1.6 pre-built version works (lib/spark-examples-1.6.2-hadoop2.6.0.jar). There are posts related to this error, but they don't seem to apply because the Spark examples folder does not contain any .sbt file.
I found the answer. To avoid the error, scopt_x.xx-x.x.x.jar must also be submitted using --jars. When you build the Spark examples, scopt_x.xx-x.x.x.jar is built alongside spark-examples_x.xx-x.x.x.jar (in my case, in the same target folder, examples/target/scala-2.11/jars).
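Since the exact Scala and scopt version numbers in the filename vary between builds, a small glob can pick up the jar without hard-coding them. This is just a sketch: the variable name is illustrative, and the path assumes the default build layout for Scala 2.11.

```shell
# Locate the scopt jar that the examples build produced.
# Glob matches any scopt version; adjust the path for your Scala version.
SCOPT_JAR=$(ls examples/target/scala-2.11/jars/scopt_*.jar 2>/dev/null | head -n 1)
echo "Found scopt jar: $SCOPT_JAR"
```

You can then pass "$SCOPT_JAR" to --jars instead of spelling out the versioned filename.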
Once you have the jar file, you can submit it with your applications:
./bin/spark-submit \
--jars examples/target/scala-2.11/jars/scopt_x.xx-x.x.x.jar \
--class org.apache.spark.examples.mllib.BinaryClassification \
--master ...