I am using Spark 2.0.0 with a web-based RStudio instance, through which I use the SparkR package.
While running a large program, if I need to kill a job mid-process, how can I do that?
The STOP button in RStudio doesn't work, and if I kill the session itself, all the objects created in that session are lost as well.
What is the best way to do it?
Since R is probably blocked waiting for a response from Spark, the most suitable way might be to access the WebUI (if it is accessible, of course) and kill the current stage there.
Open the master WebUI (default port 8080) and click on SparkR, which is the application name. (The application's own UI is also reachable directly, on port 4040 by default.)
You are now in the SparkR application UI. Click the Stages tab and kill the active stage via its (kill) link. This of course doesn't kill everything: it only cancels that one stage, so other active stages may need to be killed as well. The blocked call in R should then return with an error, while your session and the objects created in it are preserved.
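One caveat: the (kill) links only appear in the Stages page when the `spark.ui.killEnabled` setting is true. It defaults to true, but some cluster deployments disable it. Below is a minimal sketch of starting the SparkR session with it set explicitly, assuming Spark 2.0.0 in standalone mode; the master URL is a hypothetical placeholder:

```r
library(SparkR)

# A minimal sketch, not a definitive setup: start the SparkR session with
# the UI kill links explicitly enabled. spark.ui.killEnabled defaults to
# "true", but if a cluster config turns it off, no (kill) link is shown.
sparkR.session(
  master      = "spark://master-host:7077",  # hypothetical master URL
  appName     = "SparkR",
  sparkConfig = list(spark.ui.killEnabled = "true")
)
```

If you open the Stages page and see no (kill) links at all, it is worth checking this setting on the cluster before anything else.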