java · apache-spark · spark-submit

How do I submit args while using Spark Submit?


I need some help using spark-submit, please.

 ./bin/spark-submit --master spark://127.0.0.1:7077 --class Main --jars jars/test.jar firstArgs secondArgs

This is my command to start spark-submit and run the Main method of my test.jar.

My test.jar is built from a Java file with sample code like this:

public class Main {

    public static void main(String[] args) {
        System.out.println("Hello World, Im testing my patience");
        System.out.println("TEST ! : " + args[0]);
        System.out.println("TESTEST ! : " + args[1]);
    }
}
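For what it's worth, here is a slightly hardened sketch of the same class. The length check and the extracted `format` helper are my additions, not part of the original; the point is that a missing argument then fails loudly instead of throwing an `ArrayIndexOutOfBoundsException`:

```java
public class Main {

    // Helper (my addition) so the argument handling is easy to test
    static String format(String[] args) {
        if (args.length < 2) {
            // Fail loudly if spark-submit consumed the arguments
            throw new IllegalArgumentException("Expected 2 args, got " + args.length);
        }
        return "TEST ! : " + args[0] + "\nTESTEST ! : " + args[1];
    }

    public static void main(String[] args) {
        System.out.println("Hello World, Im testing my patience");
        System.out.println(format(args));
    }
}
```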


So my ideal output would be:

TEST ! : firstArgs
TESTEST ! : secondArgs

But instead I get:

 DependencyUtils: Local jar /home/me/server/spark-3.1.1-bin-hadoop3.2/firstArgs

It seems that when I pass my application arguments as

~~ --jars jars/test.jar firstArg secondArg

spark-submit tries to treat firstArg as a jar file instead of as an application argument :(

How can I fix my command line so it parses my firstArg and secondArg properly?

I've looked at posts on how to pass application arguments, but the example commands don't look any different from mine (at least to my eyes). I can't find what's wrong with my command line. Please help me :(


Solution

  • If you go by the docs at https://spark.apache.org/docs/latest/submitting-applications.html:

    The application jar is the last positional argument to spark-submit (given without --jars), and everything after it is passed as arguments to your program's main method. The --jars option is only for additional dependency jars, which is why spark-submit treated firstArgs as a jar path.

    Try:

    ./bin/spark-submit --master spark://127.0.0.1:7077 --class Main jars/test.jar firstArgs secondArgs
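    For reference, the general invocation form from that page looks like this (angle brackets mark placeholders, not literal values):

    ```
    ./bin/spark-submit \
      --class <main-class> \
      --master <master-url> \
      ... # other options
      <application-jar> \
      [application-arguments]
    ```

    Everything before the application jar configures spark-submit itself; everything after it goes to your main method.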