Tags: apache-spark, hadoop, spark-streaming, hadoop-yarn

Killing a Spark Streaming job running in YARN cluster mode through a Unix shell on error/interruption


I have a shell script that launches a Spark Streaming job in YARN cluster mode, and the script is scheduled through Autosys. When I kill the Autosys job, I would like the Spark job running in cluster mode to be killed as well.
I have tried running yarn application -kill in the shell script's error-return-code logic, but it never gets executed.
However, I am able to kill the job from another shell window: there, yarn application -kill works perfectly and kills the application.

Is there any workaround to kill the cluster-mode job automatically on interruption, from the same shell?


Solution

  • In the error-return-code logic, run yarn application -kill <appid> as an orphan process, i.e. detached from the script's process group. When Autosys kills the script, it kills the whole process group, so an ordinary child kill command dies before it can run; a detached process survives the parent's death and completes the kill. See the sketch below.
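
A minimal sketch of that idea, assuming a wrapper script around spark-submit. The log file name, the grep pattern for the application id, and the class/jar names (com.example.StreamingApp, app.jar) are assumptions for illustration; adjust them to your job. The key part is nohup setsid, which detaches the kill command into its own session so it outlives the script when Autosys terminates it.

```sh
#!/bin/bash
# Wrapper sketch: launch a Spark Streaming job in YARN cluster mode and
# kill the YARN application if this script is interrupted or errors out.

APP_LOG=/tmp/spark_submit_$$.log   # assumed log location for this sketch

cleanup() {
    # Recover the YARN application id from the spark-submit output
    # (cluster-mode submission logs contain lines with application_<id>).
    appid=$(grep -o 'application_[0-9]*_[0-9]*' "$APP_LOG" | head -1)
    if [ -n "$appid" ]; then
        # Detach the kill command from this shell's session/process group
        # (the "orphan process" trick), so it still runs after Autosys
        # kills this script and its process group.
        nohup setsid yarn application -kill "$appid" >/dev/null 2>&1 &
    fi
}

# Fire cleanup on interruption/termination and on exit. On a normal
# completion the application has already finished, so the kill is a no-op.
trap cleanup EXIT INT TERM

# Launch the job; tee the output so the application id can be parsed later.
spark-submit --master yarn --deploy-mode cluster \
    --class com.example.StreamingApp app.jar 2>&1 | tee "$APP_LOG"
```

The same approach works from an explicit error-return-code check instead of a trap: after spark-submit returns, test $? and, on a non-zero code, run the detached nohup setsid yarn application -kill line shown above.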