It appears I'm missing something in the way Zeppelin reads interpreter-specific configuration. For example, I set `spark.cores.max` to 12 in `zeppelin-env.sh` and in `spark-defaults.sh` in `$SPARK_HOME/conf`, but starting the Spark interpreter launched a Spark application with only 4 cores. Then I changed that property in the interpreter UI of Zeppelin and it worked.
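For reference, this is roughly how I had set it in `zeppelin-env.sh` (a sketch of my setup; the install path and core count are just my values, and `SPARK_SUBMIT_OPTIONS` is the variable Zeppelin's `zeppelin-env.sh` template provides for passing Spark properties through):

```shell
# conf/zeppelin-env.sh (sketch; paths are from my environment)
export SPARK_HOME=/opt/spark

# Forward Spark properties to the spark-submit call Zeppelin makes
# when it starts the Spark interpreter.
export SPARK_SUBMIT_OPTIONS="--conf spark.cores.max=12"
```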
So which file should I be using, `zeppelin-env.sh` or `zeppelin-site.xml`?

There is a hierarchy here:
1. properties set in the interpreter UI take precedence over those in `zeppelin-env.sh`;
2. `zeppelin-env.sh` takes precedence over what is specified in `spark-defaults.sh`; and,
3. `spark-defaults.sh`.
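The lookup behaves like a layered override: the first layer that defines a key wins. A toy sketch (plain Python, not Zeppelin's actual resolution code; the layer names and values are illustrative):

```python
# Toy model of layered configuration precedence (illustrative only;
# not Zeppelin's real implementation).
layers = [
    ("interpreter UI", {"spark.cores.max": "4"}),                # highest priority
    ("zeppelin-env.sh", {"spark.cores.max": "12"}),
    ("spark-defaults.sh", {"spark.master": "spark://host:7077"}),  # lowest priority
]

def effective(key):
    """Return (value, source) from the first layer that defines key."""
    for name, props in layers:
        if key in props:
            return props[key], name
    return None, None

# The interpreter UI's value of 4 shadows the 12 set in zeppelin-env.sh,
# which matches the behaviour described in the question.
print(effective("spark.cores.max"))
```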
There is an important duality here, with respect to what one would expect with any Spark application:

1. properties specified via `spark-submit` take precedence over those specified in `spark-defaults.sh`; and,
2. `spark-defaults.sh`.
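That is the same rule you see when submitting any job by hand: a `--conf` flag on the command line overrides the same key from the defaults file. A sketch (the master URL and application name are placeholders):

```shell
# Plain-Spark analogue: --conf on the command line beats the value
# the defaults file sets for the same key.
spark-submit \
  --master spark://master-host:7077 \
  --conf spark.cores.max=12 \
  my_app.py
```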
.So what you are observing is to be expected, although I too find it confusing (and not particularly well documented anywhere).