apache-spark, jupyter, amazon-emr, livy

Increasing Spark application timeout in Jupyter/Livy


I'm using a shared EMR cluster with JupyterHub installed. When the cluster is under heavy load, starting a Spark application fails with a timeout error. How do I increase the timeout for a Spark application from 60 seconds to something longer, such as 900 seconds (15 minutes)?


Solution

  • I've found the correct file for adjusting the timeout:

    /etc/jupyter/conf/config.json

    "livy_session_startup_timeout_seconds": 900

    The timeout is now 900 seconds instead of the previous 60. A minimal sketch of the edited file is shown below.
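
For reference, here is a rough sketch of what /etc/jupyter/conf/config.json could look like after the edit. The surrounding keys are illustrative only (sparkmagic-style configs usually contain endpoint credentials and session settings); keep whatever entries already exist in your file and only add or change the timeout value:

    {
      "kernel_python_credentials": {
        "url": "http://localhost:8998"
      },
      "livy_session_startup_timeout_seconds": 900
    }

After saving the change, restarting JupyterHub (or at least the notebook kernel) is typically needed before the new timeout takes effect.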