Tags: python, apache-spark, sigkill

How to kill an Apache Spark application running in the background after killing it from the Spark web UI


The code below successfully creates a Spark context when I submit it with spark-submit, and it runs fine.

When I kill the application under Running Applications in the Apache Spark web UI, the application state shows KILLED, but it is still printing "Test application" on the screen after the kill:

[Screenshot: application running on the Apache Spark web UI]

[Screenshot: application killed using the "kill" button on the Spark web UI]

[Screenshot: the message is still printed on screen after the application is killed]

I need a way to automatically kill the Python job when the Spark context is killed.

from pyspark import SparkConf
from pyspark import SparkContext

if __name__ == "__main__":
    conf = SparkConf().setAppName("TEST")
    conf.set("spark.scheduler.mode", "FAIR")
    sc = SparkContext(conf=conf)

    while True:
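        # This loop keeps running even after the Spark application is killed from the web UI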
        print("Test application")

Solution

  • I found a way to solve my issue with the code below. Thanks for all your responses.

    from pyspark import SparkConf
    from pyspark import SparkContext
    
    if __name__ == "__main__":
        conf = SparkConf().setAppName("TEST")
        conf.set("spark.scheduler.mode", "FAIR")
        sc = SparkContext(conf=conf)
    
        while True:
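            # isStopped() becomes True once the SparkContext is shut down,
            # e.g. when the application is killed from the web UI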
            if sc._jsc.sc().isStopped():
                break
            print("Test application")