I am trying to check whether a DataFrame has any data in it.
The code is:
df = spark.sql("SELECT * FROM pts_dev.data_quality.rpa where family = '{0}' and mm_aaaa_ref = '{1}'".format(family,mm_aaaa))
display(df)
if df is not None:
    print("no empty")
else:
    print("empty")
The df does not return any result (it is empty), but the message I get is "no empty". I think my error is in how I check df. I have also tried if len(df) is not None:, but it says that a DataFrame has no len().
Could you help me?
Thanks in advance
spark.sql() always returns a DataFrame object, even when the query matches no rows, so df is not None is always True. To check whether the DataFrame actually contains rows, use df.isEmpty(). Ref: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.isEmpty.html#pyspark-sql-dataframe-isempty
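A minimal sketch of that check, assuming Spark 3.3 or later (where DataFrame.isEmpty() was introduced) and the same family and mm_aaaa variables from the question:

df = spark.sql(
    "SELECT * FROM pts_dev.data_quality.rpa "
    "WHERE family = '{0}' AND mm_aaaa_ref = '{1}'".format(family, mm_aaaa)
)

# spark.sql() returns a DataFrame object even when no rows match,
# so comparing against None never detects an empty result.
if df.isEmpty():
    print("empty")
else:
    print("not empty")

On Spark versions before 3.3, you can get the same check with len(df.head(1)) == 0, which avoids counting every row the way df.count() == 0 would.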