Tags: pyspark, apache-spark-sql, databricks-sql

How to convert t-sql datefromparts in Databricks sparkSQL without creating a function


I have the following T-SQL code that I would like to convert to Spark SQL:

SELECT datefromparts(YEAR(sysdatetime()) - 1, 12, 31)

The problem is that there is no equivalent `datefromparts` function in Spark SQL.

Can someone show me how to convert this T-SQL to Spark SQL?

The expected output is 31/12/2022.


Solution

  • Use the trunc function for this case: trunc(current_date(), 'year') returns the first day of the current year, and subtracting 1 steps back one day to the last day of the previous year.

    select trunc(current_date(), 'year') - 1 as req_dt
    -- req_dt
    -- 2022-12-31
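    As an aside: if your runtime is Spark 3.0 or later (which includes current Databricks runtimes), there is in fact a near-direct equivalent of datefromparts called make_date, so the conversion can also be written literally:

    select make_date(year(current_date()) - 1, 12, 31) as req_dt
    -- req_dt
    -- 2022-12-31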
    

    To render the date in the requested dd/MM/yyyy form, use the date_format function:

    select date_format(trunc(current_date(), 'year') - 1, 'dd/MM/yyyy') as req_dt
    -- req_dt
    -- 31/12/2022
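For a quick sanity check of the logic outside Spark, the same computation can be sketched in plain Python with only the standard library (`prev_year_end` is a hypothetical helper name, not part of any Spark API):

```python
from datetime import date, timedelta

def prev_year_end(today: date) -> date:
    """Last day of the previous year: Jan 1 of the current year minus
    one day, mirroring trunc(current_date(), 'year') - 1 in Spark SQL."""
    return date(today.year, 1, 1) - timedelta(days=1)

# Example: run against a fixed "today" so the result is reproducible.
d = prev_year_end(date(2023, 6, 15))
print(d)                        # 2022-12-31
print(d.strftime("%d/%m/%Y"))   # 31/12/2022, matching the requested format
```

This mirrors why the Spark expression works: truncating to the year gives January 1, and subtracting one day lands on December 31 of the prior year.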