apache-spark, hadoop-yarn, pyspark

How to kill a running Spark application?


I have a running Spark application that occupies all the cores, so my other applications are not allocated any resources.

I did some quick research, and people suggested using YARN kill or /bin/spark-class to kill the application. However, I am using a CDH distribution and /bin/spark-class doesn't exist at all, and killing the application through YARN doesn't work either.


Can anyone help me with this?


Solution

    • Copy the application ID from the Spark scheduler, for instance application_1428487296152_25597
    • Connect to the server that launched the job
    • Run yarn application -kill application_1428487296152_25597 (see the sketch after this list)
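If you don't have the application ID handy, here is a minimal sketch of the same workflow from that server's shell, assuming the yarn CLI is on the PATH (the application ID below is the example one from the list above):

    # List running YARN applications to find the Spark job's ID
    yarn application -list -appStates RUNNING

    # Kill the application by its ID
    yarn application -kill application_1428487296152_25597

After the kill command returns, the application should show up with a final state of KILLED in the YARN ResourceManager UI, and its cores are released for other applications.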