I'm using --num-executors with spark-submit on EMR, but the setting is not being honored: the job runs with multiple executors even when I pass --num-executors 1. I tried different EC2 instance types, and the default number of executors the job runs with seems to vary by instance type.
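For reference, the submit command looks roughly like this (the class name and jar are placeholders, not my actual job):

spark-submit --num-executors 1 --class com.example.MyApp myapp.jar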
My guess is that dynamic allocation is active (EMR enables it by default), and it overrides --num-executors. You need to disable it in your Spark configuration, or pass it on the spark-submit command line like this:
spark-submit --conf spark.dynamicAllocation.enabled=false --class ...
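For completeness, a full invocation might look like this (the app class, jar, and resource sizes are placeholders, adjust them to your job):

spark-submit \
  --conf spark.dynamicAllocation.enabled=false \
  --num-executors 1 \
  --executor-memory 4g \
  --executor-cores 2 \
  --class com.example.MyApp \
  myapp.jar

With dynamic allocation disabled, --num-executors is honored. Otherwise, EMR's default of spark.dynamicAllocation.enabled=true lets YARN add and remove executors at runtime, and since the default executor memory and cores are derived from the instance type, the number of executors you end up with varies across instance types.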