While developing Spark programs I work on my local machine, so I have to call setMaster with "local". However, when I submit the jar built from that locally developed program to a cluster, I obviously do not want to use "local" mode.
How can I use something like Typesafe Config to set "local" when testing and "yarn-cluster" in production?
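For illustration, this is roughly the kind of setup I was imagining with Typesafe Config: read the master from application.conf when it is defined and leave it unset otherwise. The key name app.spark.master and the fallback behaviour are just a sketch, not something I have working.

```scala
import com.typesafe.config.ConfigFactory
import org.apache.spark.sql.SparkSession

object SparkApp {
  def main(args: Array[String]): Unit = {
    // Loads application.conf from the classpath (or the file given via -Dconfig.file=...)
    val conf = ConfigFactory.load()

    val builder = SparkSession.builder().appName("my-app")

    // Only set the master when the config provides one; otherwise let
    // spark-submit (e.g. --master yarn) decide it on the cluster.
    val spark =
      if (conf.hasPath("app.spark.master"))
        builder.master(conf.getString("app.spark.master")).getOrCreate()
      else
        builder.getOrCreate()

    // ... job logic ...
    spark.stop()
  }
}
```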
EDIT:
Based on the solution from @Shaido, for IntelliJ IDEA:
Go to Run -> Edit Configurations -> under the application configuration, set:
VM options = -Dspark.master=local[*]
If you are using an IDE, then you do not need to hardcode setMaster into the code.
For Eclipse, you can go to "Run configurations" -> "Arguments" -> "VM arguments" and add
-Dspark.master=local[*]
This will use all available cores when running locally. Other IDEs should have similar configurations. This way, there is no need to add anything to the code itself.
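To make that concrete, a minimal sketch of a job that never calls .master(...) could look like this (the object and app name are just placeholders):

```scala
import org.apache.spark.sql.SparkSession

object MyJob {
  def main(args: Array[String]): Unit = {
    // No .master(...) here: the master comes from -Dspark.master in the IDE,
    // or from --master on spark-submit when running on the cluster.
    val spark = SparkSession.builder()
      .appName("my-spark-job")
      .getOrCreate()

    spark.range(10).show() // trivial action just to verify the session works

    spark.stop()
  }
}
```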
When running on the cluster, use:
spark-submit --master yarn --deploy-mode cluster
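For example, where com.example.MyJob and my-spark-job.jar are placeholders for your own main class and assembly jar:

```
spark-submit \
  --class com.example.MyJob \
  --master yarn \
  --deploy-mode cluster \
  my-spark-job.jar
```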