Tags: r, apache-spark, shiny, sparklyr

R Shiny and Spark: how to free Spark resources?


Say we have a Shiny app deployed on a Shiny Server. We expect that the app will be used by several users via their web browsers, as usual.

The Shiny app's server.R includes some sparklyr package code which connects to a Spark cluster for classic filter, select, mutate, and arrange operations on data located on HDFS.
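
For context, a minimal sketch of the kind of sparklyr code meant here; the table name, columns, and HDFS path are made up for illustration:

    library(sparklyr)
    library(dplyr)

    sc <- spark_connect(master = "yarn-client")

    # Read a (hypothetical) dataset from HDFS into Spark
    flights_tbl <- spark_read_parquet(sc, name = "flights",
                                      path = "hdfs:///data/flights")

    # dplyr verbs are translated to Spark SQL and executed on the cluster;
    # collect() brings the result back into R
    result <- flights_tbl %>%
      filter(dep_delay > 0) %>%
      mutate(gain = dep_delay - arr_delay) %>%
      select(carrier, dep_delay, arr_delay, gain) %>%
      arrange(desc(gain)) %>%
      collect()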

Is it mandatory to disconnect from Spark, i.e. to include a spark_disconnect at the end of the server.R code to free resources? I think we should never disconnect and instead let Spark handle the load as each user arrives and leaves. Can somebody please confirm this?


Solution

  • TL;DR: SparkSession and SparkContext are not lightweight resources that can be started on demand.

    Putting aside all security considerations related to starting a Spark session directly from a user-facing application, maintaining a SparkSession inside server (starting a session on entry, stopping it on exit) is simply not a viable option.

    The server function will be executed every time there is an incoming event, effectively restarting the whole Spark application and rendering the project unusable. And this is only the tip of the iceberg. Since Spark reuses existing sessions (only one context is allowed per JVM), multi-user access could lead to random failures if the reused session has been stopped by another server call.

    One possible solution is to register onSessionEnded with spark_disconnect, but I am pretty sure it will be useful only in a single-user environment.
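
    A minimal sketch of what that might look like, assuming the connection is opened inside the server function (the master URL and the rest of the server body are assumptions):

    library(shiny)
    library(sparklyr)

    server <- function(input, output, session) {
      # Hypothetical connection; in a multi-user app this is exactly the
      # pattern the points above warn against
      sc <- spark_connect(master = "yarn-client")

      # Disconnect when this user's Shiny session ends
      session$onSessionEnded(function() {
        spark_disconnect(sc)
      })

      # ... sparklyr / dplyr logic using sc ...
    }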

    Another possible approach is to use a global connection and wrap runApp with a function that calls spark_disconnect_all on exit:

    runApp <- function() {
      # Register the cleanup before the blocking call, so it also runs
      # if shiny::runApp() is interrupted or fails
      on.exit({
        spark_disconnect_all()
      })
      shiny::runApp()
    }
    

    although in practice the resource manager should free the resources when the driver disassociates, even without stopping the session explicitly.
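
    For completeness, a sketch of the global-connection side of this approach, assuming the connection is created once (for example in global.R) and shared by all sessions; the master URL is an assumption:

    # global.R (hypothetical) -- one app-wide connection, reused by every session
    library(sparklyr)
    library(dplyr)

    sc <- spark_connect(master = "yarn-client")

    # server.R then uses `sc` directly and never disconnects; the wrapped
    # runApp above calls spark_disconnect_all once, when the app shuts down.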