Tags: apache-spark, databricks, aws-databricks

How can I set spark.task.maxFailures on AWS Databricks?


I would like to set spark.task.maxFailures to a value greater than 4. How can I set this on the Databricks 6.4 runtime?

When I execute spark.conf.get("spark.task.maxFailures"), I get the error below:

java.util.NoSuchElementException: spark.task.maxFailures

Has anyone set this on databricks before?

I understand I can set it using

spark.conf.set("spark.task.maxFailures", 10)

however, I'm not sure whether this has to be set at cluster start time or whether it can be set afterwards.


Solution

  • You can set the required Spark configuration in the cluster's Advanced Options (Spark config section); it is applied when the cluster starts. See the example below.

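    As a minimal sketch (exact UI labels may vary across Databricks versions), add the key-value pair on its own line in the Spark config box under the cluster's Advanced Options, for example:

    spark.task.maxFailures 10

    Because spark.task.maxFailures is a cluster-wide scheduler setting, it takes effect once the cluster is (re)started with this config. You can then verify it from a notebook by reading the SparkContext's SparkConf rather than the SQL runtime conf, for example in Python:

    # Reads the cluster-level SparkConf that was set at cluster start
    print(spark.sparkContext.getConf().get("spark.task.maxFailures"))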