Tags: apache-spark, pyspark, azure-synapse

Azure Synapse - How to catch SparkException


I tried the following imports:

import org.apache.spark.SparkException
from org.apache.spark.SparkException import SparkException
from org.apache.spark import SparkException

All of these fail with ModuleNotFoundError, e.g. No module named 'org.apache.spark.SparkException'.

I need to handle PySpark exceptions in Azure Synapse like this:

except (Py4JJavaError, SparkException, TypeError) as e:
    print(e)

Solution

  • org.apache.spark.SparkException is a Scala exception thrown inside the JVM process; you can't (and don't need to) import or catch it directly in PySpark. When such an exception crosses the Py4J bridge, it surfaces on the Python side as py4j.protocol.Py4JJavaError, and the wrapped JVM exception is available via e.java_exception. So catch Py4JJavaError and inspect the underlying Java exception's class name if you need to react to SparkException specifically.
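A minimal sketch of that pattern follows. The helper only does a string check on the wrapped JVM exception, and the commented-out notebook usage assumes a hypothetical abfss path and an active `spark` session in Synapse:

```python
def is_spark_exception(err):
    """Return True if err (typically a py4j.protocol.Py4JJavaError) wraps
    an org.apache.spark.SparkException raised in the JVM.

    Py4JJavaError exposes the JVM exception via .java_exception; for other
    exception types we fall back to checking the message text."""
    java_exc = getattr(err, "java_exception", None)
    text = str(java_exc) if java_exc is not None else str(err)
    return "org.apache.spark.SparkException" in text


# Usage in a Synapse notebook (sketch; the path below is a placeholder):
# from py4j.protocol import Py4JJavaError
#
# try:
#     df = spark.read.parquet("abfss://container@account.dfs.core.windows.net/data")
#     df.count()
# except Py4JJavaError as e:
#     if is_spark_exception(e):
#         print("SparkException from the JVM:", e.java_exception)
#     else:
#         raise
# except TypeError as e:
#     print(e)
```

This keeps the except clause purely on Python-side types (Py4JJavaError, TypeError), which is all PySpark can actually catch, while still letting you branch on the original JVM error.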