Tags: scala, apache-spark, unix-timestamp

Spark sql Unix timestamp returning wrong output


The query below returns 2017-02-23 00:45:00 instead of the expected 2017-02-23 12:45:00.

spark.sql("select from_unixtime(unix_timestamp(('2017-02-23 12:45:00')," +
          "'yyyy-MM-dd hh:mm:ss'))").show(false)

But the query below returns the expected output, 2017-02-23 13:45:00:

spark.sql("select from_unixtime(unix_timestamp(('2017-02-23 13:45:00')," +
          "'yyyy-MM-dd hh:mm:ss'))").show(false)

Can someone please help?


Solution

  • You should use a capital H, i.e. HH:mm:ss. Lowercase hh is the 12-hour clock (1-12), so 12 is parsed as 12 AM and comes out as 00:45, while HH is the 24-hour clock (0-23).

    spark.sql("select from_unixtime(unix_timestamp(('2017-02-23 12:45:00'),'yyyy-MM-dd HH:mm:ss')) AS date").show(false)
    

    which should give you

    +-------------------+
    |date               |
    +-------------------+
    |2017-02-23 12:45:00|
    +-------------------+
    

    You can get more info in the documentation for Spark's datetime patterns.
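The difference can be reproduced outside Spark: in Spark 2.x, unix_timestamp delegates parsing to Java's SimpleDateFormat, so a minimal sketch in plain Java (no Spark required, an assumption about the question's Spark version) shows how hh and HH treat the same string:

```java
import java.text.SimpleDateFormat;

public class HourPatternDemo {
    public static void main(String[] args) throws Exception {
        SimpleDateFormat out = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");

        // hh is the 12-hour clock (1-12); with no AM/PM marker, "12" means 12 AM
        SimpleDateFormat twelveHour = new SimpleDateFormat("yyyy-MM-dd hh:mm:ss");
        System.out.println(out.format(twelveHour.parse("2017-02-23 12:45:00")));
        // prints 2017-02-23 00:45:00 (the wrong output from the question)

        // lenient parsing rolls hour 13 forward, so "13" happens to come out right
        System.out.println(out.format(twelveHour.parse("2017-02-23 13:45:00")));
        // prints 2017-02-23 13:45:00 (why the second query looked correct)

        // HH is the 24-hour clock (0-23); "12" is parsed as noon
        SimpleDateFormat twentyFourHour = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        System.out.println(out.format(twentyFourHour.parse("2017-02-23 12:45:00")));
        // prints 2017-02-23 12:45:00
    }
}
```

This also explains the question's second query: with hh, 13 only parsed "correctly" because SimpleDateFormat is lenient by default and rolls the out-of-range hour forward, not because the pattern was right.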