Tags: r, apache-spark, sparkr

If null, replace with 0, otherwise use the default value in the same column


In the SparkR shell (Spark 1.5.0), I created a sample data set:

df_test <- createDataFrame(sqlContext, data.frame(mon = c(1,2,3,4,5), year = c(2011,2012,2013,2014,2015)))
df_test1 <- createDataFrame(sqlContext, data.frame(mon1 = c(1,2,3,4,5,6,7,8)))
df_test2 <- join(df_test1, df_test, joinExpr = df_test1$mon1 == df_test$mon, joinType = "left_outer")

Resulting data set, df_test2:

+----+----+------+
|mon1| mon|  year|
+----+----+------+
| 7.0|null|  null|
| 1.0| 1.0|2011.0|
| 6.0|null|  null|
| 3.0| 3.0|2013.0|
| 5.0| 5.0|2015.0|
| 8.0|null|  null|
| 4.0| 4.0|2014.0|
| 2.0| 2.0|2012.0|
+----+----+------+
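
The nulls come from the left outer join: months 6, 7 and 8 in df_test1 have no match in df_test. As a quick sanity check, the null rows can be counted with standard SparkR column operations (filter/isNull/count, using the sample data above):

# count the rows whose year is null; should return 3 for this sample data
count(filter(df_test2, isNull(df_test2$year)))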

Question: if there is a null in column df_test2$year, how can I replace it with 0 (or some other default value)?

The output should look like this:

+----+----+------+
|mon1| mon|  year|
+----+----+------+
| 7.0|null|     0|
| 1.0| 1.0|2011.0|
| 6.0|null|     0|
| 3.0| 3.0|2013.0|
| 5.0| 5.0|2015.0|
| 8.0|null|     0|
| 4.0| 4.0|2014.0|
| 2.0| 2.0|2012.0|
+----+----+------+

I have used otherwise/when, but it doesn't work:

df_test2$year <- otherwise(when(isNull(df_test2$year), 0 ), df_test2$year)

It threw this error:

Error in rep(yes, length.out = length(ans)) :
  attempt to replicate an object of type 'environment'

Solution

  • I have used a raw SQL CASE WHEN expression to get the answer (after registering df_test2 as a temporary table named temp):

    # register df_test2 as a temporary table so it can be queried as "temp"
    registerTempTable(df_test2, "temp")
    df_test3 <- sql(sqlContext, "select mon1, mon, case when year is null then 0 else year end year FROM temp")
    
    showDF(df_test3)
    +----+----+------+
    |mon1| mon|  year|
    +----+----+------+
    | 7.0|null|   0.0|
    | 1.0| 1.0|2011.0|
    | 6.0|null|   0.0|
    | 3.0| 3.0|2013.0|
    | 5.0| 5.0|2015.0|
    | 8.0|null|   0.0|
    | 4.0| 4.0|2014.0|
    | 2.0| 2.0|2012.0|
    +----+----+------+
    

    Even though it gives the answer, I am looking for pure SparkR code (a couple of DataFrame-API candidates are sketched below).
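
For a DataFrame-only approach, here is a minimal sketch of two candidates. Both are untested assumptions against this exact build: ifelse and isNull are documented SparkR 1.5.0 Column operations, while fillna only appears in later SparkR releases, so verify availability before relying on either.

# Option 1: column-level conditional using SparkR's ifelse with isNull
df_test2$year <- ifelse(isNull(df_test2$year), 0, df_test2$year)

# Option 2: replace nulls per column with fillna (newer SparkR releases only)
df_test2 <- fillna(df_test2, list(year = 0))

If either works in your environment, showDF(df_test2) should then print 0.0 in place of the nulls.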