Tags: pandas, datetime, pyspark, date-range

How to replace the pandas Timedelta function with a pure PySpark function?


I am developing a small script in PySpark that generates a date sequence (36 months before today's date), truncating each date to the first day of its month. Overall I succeeded at this task.

However, I only managed it by using the pandas Timedelta function to compute the time delta.

Is there a way to replace this pandas Timedelta with a pure PySpark function?

import pandas as pd
from datetime import date, timedelta, datetime
from pyspark.sql.functions import col, date_trunc

today = datetime.today()
data = [((date(today.year, today.month, 1) - pd.Timedelta(36, 'M')), date(today.year, today.month, 1))]  # I want to replace this pandas function
df = spark.createDataFrame(data, ["minDate", "maxDate"])

+----------+----------+
|   minDate|   maxDate|
+----------+----------+
|2016-10-01|2019-10-01|
+----------+----------+

import pyspark.sql.functions as f

df.withColumn("monthsDiff", f.months_between("maxDate", "minDate"))\
    .withColumn("repeat", f.expr("split(repeat(',', monthsDiff), ',')"))\
    .select("*", f.posexplode("repeat").alias("date", "val"))\
    .withColumn("date", f.expr("add_months(minDate, date)"))\
    .select("date")\
    .show(n=50)

+----------+
|      date|
+----------+
|2016-10-01|
|2016-11-01|
|2016-12-01|
|2017-01-01|
|2017-02-01|
|2017-03-01|
 etc...
+----------+
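Note: if Spark 2.4+ is available, the repeat/split/posexplode trick above can be replaced by the built-in sequence function, which enumerates dates directly. A minimal sketch, assuming the minDate/maxDate columns from the dataframe above:

import pyspark.sql.functions as f

# sequence() yields every month between minDate and maxDate (inclusive);
# explode() turns the resulting array into one row per month.
df.select(
    f.explode(
        f.expr("sequence(minDate, maxDate, interval 1 month)")
    ).alias("date")
).show(n=50)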

Solution

  • You can use PySpark's built-in trunc function.

    pyspark.sql.functions.trunc(date, format) Returns date truncated to the unit specified by the format.

    Parameters:
    format – ‘year’, ‘YYYY’, ‘yy’ or ‘month’, ‘mon’, ‘mm’
    

    Imagine I have the below dataframe:

    data = [(1,)]
    df = spark.createDataFrame(data, ['id'])
    
    import pyspark.sql.functions as f
    
    # trunc() snaps today's date to the first of the month;
    # add_months(..., -36) steps back 36 months from there.
    df = df.withColumn("start_date", f.add_months(f.trunc(f.current_date(), "month"), -36))
    df = df.withColumn("max_date", f.trunc(f.current_date(), "month"))
    
    
    >>> df.show()
    +---+----------+----------+
    | id|start_date|  max_date|
    +---+----------+----------+
    |  1|2016-10-01|2019-10-01|
    +---+----------+----------+
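
    To get the full monthly series rather than just the two endpoint dates, the start_date/max_date columns can feed Spark's sequence function. A minimal sketch, assuming Spark 2.4+ (where sequence accepts date arguments and an interval step), with no Pandas anywhere:

    # Enumerate every month between start_date and max_date (inclusive),
    # then explode the array into one row per month.
    df.select(
        f.explode(
            f.expr("sequence(start_date, max_date, interval 1 month)")
        ).alias("date")
    ).show(n=50)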
    

    Here's a link with more details on Spark date functions:

    PySpark Date Functions