I have a dataframe which contains null values:
from pyspark.sql import functions as F
df = spark.createDataFrame(
    [(125, '2012-10-10', 'tv'),
     (20, '2012-10-10', 'phone'),
     (40, '2012-10-10', 'tv'),
     (None, '2012-10-10', 'tv')],
    ["Sales", "date", "product"]
)
I need to count the non-null values in the "Sales" column.
I tried 3 methods.
The first one gives the correct result:
df.where(F.col("sales").isNotNull()).groupBy('product')\
.agg((F.count(F.col("Sales")).alias("sales_count"))).show()
# product | sales_count
# phone | 1
# tv | 2
The second one gives an incorrect result:
df.groupBy('product')\
.agg((F.count(F.col("Sales").isNotNull()).alias("sales_count"))).show()
# product | sales_count
# phone | 1
# tv | 3
With the third one, I got an error:
df.groupBy('product')\
.agg((F.col("Sales").isNotNull().count()).alias("sales_count")).show()
TypeError: 'Column' object is not callable
What causes the wrong result in the second method and the error in the third?
Your first attempt filters out the rows with null in the Sales column before the aggregation, so it gives you the correct result.
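If it helps to see what that filter does, showing the filtered DataFrame (on the sample data above) should give:
df.where(F.col("Sales").isNotNull()).show()
+-----+----------+-------+
|Sales|      date|product|
+-----+----------+-------+
|  125|2012-10-10|     tv|
|   20|2012-10-10|  phone|
|   40|2012-10-10|     tv|
+-----+----------+-------+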
But with your second attempt,
df.groupBy('product')\
    .agg((F.count(F.col("Sales").isNotNull()).alias("sales_count"))).show()
you haven't filtered anything out, so the aggregation runs over the whole dataset. If you look closely, F.col("Sales").isNotNull() returns a boolean column containing true and false. So F.count(F.col("Sales").isNotNull()) is just counting the boolean values in each group, which becomes evident if you add that expression as a new column:
df.withColumn("isNotNull", F.col("Sales").isNotNull()).show()
which would give you
+-----+----------+-------+---------+
|Sales| date|product|isNotNull|
+-----+----------+-------+---------+
| 125|2012-10-10| tv| true|
| 20|2012-10-10| phone| true|
| 40|2012-10-10| tv| true|
| null|2012-10-10| tv| false|
+-----+----------+-------+---------+
Both true and false are non-null values, so F.count counts every row in the group; that is why your second attempt reports 3 for tv. The counts are consistent with what the expression actually computes, just not the count of non-null sales you want.
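If you want to keep the isNotNull() idea, one way (a sketch, not the only option) is to cast the booleans to integers and sum them, so only the true values add to the count:
df.groupBy('product')\
    .agg(F.sum(F.col("Sales").isNotNull().cast("int")).alias("sales_count")).show()
which should give the same sales_count as your first attempt (1 for phone, 2 for tv).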
For your third attempt, .count() is a DataFrame action, not a method on Column, which is why Spark raises TypeError: 'Column' object is not callable; it cannot be used inside an aggregation. Only expressions that return a Column can be passed to .agg(), whether they come from built-in functions, UDFs, or your own functions returning a Column.
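As a side note, F.count already ignores nulls when it is given a column, so a minimal version without the where filter should also work:
df.groupBy('product')\
    .agg(F.count(F.col("Sales")).alias("sales_count")).show()
The only difference from your first attempt is that a product whose Sales are all null would still show up here, with a count of 0, instead of being filtered out entirely.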