Tags: java, floating-point, apache-spark-sql, double

How to round down values in Spark SQL


I am querying a Postgres database using Spark SQL. I need to round down the value of min(months_between(current_date, somedate)).

For example, 13.83 should become 13 and not 14, and even 13.9 should become 13 as well. Is there a function that can round down this value in Spark SQL? Any help will be highly appreciated.


Solution

  • You can use the floor() function, which rounds its argument down to the nearest integer.

    Note that floor(x) gives the largest integer ≤ x. So floor(5.8) returns 5, but floor(-5.8) returns -6. If your values may be negative and you want to round them towards 0, you must test their sign and use either floor() or ceil() (which rounds up to the next integer).
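The sign-dependent behavior can be illustrated in plain Python, whose math.floor() and math.ceil() behave like the Spark SQL functions of the same names; round_towards_zero() here is a hypothetical helper sketching the sign test described above:

```python
import math

def round_towards_zero(x):
    # Hypothetical helper: floor() for non-negative values,
    # ceil() for negative ones, so the result always moves towards 0.
    return math.floor(x) if x >= 0 else math.ceil(x)

print(math.floor(5.8))           # 5
print(math.floor(-5.8))          # -6 (rounds down, away from zero)
print(round_towards_zero(-5.8))  # -5 (rounds towards zero)
print(round_towards_zero(13.83)) # 13
```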

    Also note that casting a float to an int truncates it towards zero whatever its sign. I'm not sure of the exact behavior of CAST(x AS INT) in Spark SQL, but it may also solve your problem.
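Plain Python's int() shows the same truncate-towards-zero behavior, and is a reasonable stand-in for the cast mentioned above (verify CAST semantics in your Spark version before relying on it):

```python
# int() truncates towards zero regardless of sign,
# unlike math.floor(), which always rounds down.
print(int(13.83))   # 13
print(int(-13.83))  # -13 (floor would give -14)
```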