Tags: pyspark, apache-spark-sql, sql, datetime

PySpark SQL: add literal letters to a formatted date value


I have epoch time values in milliseconds in a Spark dataframe, like 1569872588019, and I'm using PySpark SQL in a Jupyter notebook.

I'm using the from_unixtime function to convert them to dates.

Here is my code:

SELECT from_unixtime(dataepochvalues/1000,'yyyy-MM-dd%%HH:MM:ss') AS date FROM testdata

The result is like: 2019-04-30%%11:09:11

But what I want is like: 2019-04-30T11:04:48.366Z

I tried putting T and Z in place of %% in the pattern, but that failed.

How can I insert the literal letters T and Z?


Solution

  • You can escape literal characters in the format pattern with single quotes. For your desired output, use the following date and time pattern:

    "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"

    Note that lowercase mm means minutes, while uppercase MM is the month (that mix-up is what produced the month in your minute field). Also, from_unixtime works at whole-second precision, so the .SSS part will always print 000; the millisecond fraction of the epoch value is lost.

    Using your example:

    spark.sql(
        """SELECT from_unixtime(1569872588019/1000, "yyyy-MM-dd'T'HH:mm:ss'Z'") AS date"""
    ).show()
    #+--------------------+
    #|                date|
    #+--------------------+
    #|2019-09-30T14:43:08Z|
    #+--------------------+
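
    If you need the .366Z-style millisecond part from your desired output, from_unixtime can't produce it, since it truncates to whole seconds. One sketch of keeping the milliseconds, assuming Spark 3.1+ (for the timestamp_millis function) and the dataepochvalues column from the question; the session timezone is pinned to UTC here only to make the output deterministic:

    ```python
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.master("local[1]").getOrCreate()
    spark.conf.set("spark.sql.session.timeZone", "UTC")  # deterministic output

    df = spark.createDataFrame([(1569872588019,)], ["dataepochvalues"])

    # timestamp_millis (Spark 3.1+) keeps the millisecond part that
    # from_unixtime would drop; single quotes escape the literal T and Z.
    out = df.select(
        F.expr(
            """date_format(timestamp_millis(dataepochvalues),
                           "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'")"""
        ).alias("date")
    )
    out.show(truncate=False)  # 2019-09-30T19:43:08.019Z
    ```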
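
    As a quick plain-Python cross-check of the expected string (no Spark needed), interpreting the same epoch value in UTC:

    ```python
    from datetime import datetime, timezone

    epoch_ms = 1569872588019
    dt = datetime.fromtimestamp(epoch_ms // 1000, tz=timezone.utc)

    # In strftime, T and Z are plain characters, so no quoting is needed;
    # append the millisecond part by hand from the epoch value.
    formatted = dt.strftime("%Y-%m-%dT%H:%M:%S.") + f"{epoch_ms % 1000:03d}Z"
    print(formatted)  # 2019-09-30T19:43:08.019Z
    ```

    The hour differs from the Spark example above only because that output used the notebook's local session timezone rather than UTC.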