I have been at this for several days now, and my objective is simple: I am setting up a SparkConf() object inside a Java application, and I need to specify a custom path to the log4j.properties file. The application is meant to run on a Spark worker that already has the required custom log4j.properties file.
It seems like my Spark configuration is unable to find this file and falls back to the default one. I have placed the log4j.properties file in several locations inside the worker pod, such as /app/spark/conf/log4j.properties, but it doesn't seem to work.
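(For context, -Dlog4j.configuration is a log4j 1.x setting, so the file at that path should be a standard log4j 1.x properties file. A minimal console-only example, similar to Spark's default template, would be:)
log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n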
Here's how I'm trying to set the custom path:
SparkConf sc = new SparkConf().setMaster(master)
        .set("spark.driver.extraJavaOptions", "-Dlog4j.configuration=/app/spark/conf/log4j.properties")
        .set("spark.executor.extraJavaOptions", "-Dlog4j.configuration=/app/spark/conf/log4j.properties");
The last two .set(...) calls currently have no effect on the Spark configuration. Any idea what's wrong with this? Is something missing on my end?
Help...
It looks like you are passing the path without the file: scheme. log4j.configuration expects a URL, so add the prefix as shown below and it should pick up your file, provided it exists at that path:
SparkConf sc = new SparkConf().setMaster(master)
        .set("spark.driver.extraJavaOptions", "-Dlog4j.configuration=file:/app/spark/conf/log4j.properties")
        .set("spark.executor.extraJavaOptions", "-Dlog4j.configuration=file:/app/spark/conf/log4j.properties");