
How to do a Luhn check on a df column in Spark Scala


df has one string column with values like "100256437". I want to add another column that checks whether the value passes the Luhn check: lit(true) if it passes, lit(false) otherwise.

  def Mod10(c: Column): Column = {
    var (odd, sum) = (true, 0)
 
    for (int <- c.reverse.map { _.toString.toShort }) {
      println(int)
      if (odd) sum += int
      else sum += (int * 2 % 10) + (int / 5)
      odd = !odd
    }
    lit(sum % 10 === 0)
  }

Error:

error: value reverse is not a member of org.apache.spark.sql.Column
    for (int <- c.reverse.map { _.toString.toShort }) {
                  ^
error: value === is not a member of Int
    lit(sum % 10 === 0)
                 ^

Solution

  • It looks like you are dealing with Spark DataFrames. A Column is a lazily evaluated expression, not a Scala collection, so you cannot call .reverse on it or loop over its characters on the driver, and === is defined on Column, not on Int. One way to run this per-value logic is a UDF.

    Let's say you have this dataframe

    import spark.implicits._ // needed for .toDF() (assumes an active SparkSession named `spark`)

    val df = List("100256437", "79927398713").toDF()
    
    df.show()
    
    +-----------+
    |      value|
    +-----------+
    |  100256437|
    |79927398713|
    +-----------+
    

    Now, you can implement the Luhn test as a UDF,

    import org.apache.spark.sql.functions.{col, udf}

    val isValidLuhn = udf { (s: String) =>
      val array = s.toCharArray.map(_.toString.toInt)

      val len = array.length

      // Walk the digits from the right: position i = 1 is the check digit;
      // every second position from the right (i % 2 == 0) is doubled.
      var i = 1
      while (i <= len) {
        if (i % 2 == 0) {
          var updated = array(len - i) * 2
          // A doubled digit above 9 is replaced by the sum of its digits
          // (equivalent to subtracting 9).
          while (updated > 9) {
            updated = updated.toString.toCharArray.map(_.toString.toInt).sum
          }
          array(len - i) = updated
        }
        i = i + 1
      }

      val sum = array.sum

      // The number is valid when the total is a multiple of 10.
      (sum % 10) == 0
    }
    

    Which can be used as,

    val dfWithLuhnCheck = df.withColumn("isValidLuhn", isValidLuhn(col("value")))
    
    dfWithLuhnCheck.show()
    
    +-----------+-----------+
    |      value|isValidLuhn|
    +-----------+-----------+
    |  100256437|       true|
    |79927398713|       true|
    +-----------+-----------+
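
    If you also want to call the check from Spark SQL, or keep only the rows that pass it, here is a minimal sketch along the same lines. It assumes an active SparkSession named spark; the is_valid_luhn name and the luhnCheck helper are only illustrative.

    // Same Luhn logic as a plain Scala function, so it can be unit-tested
    // and registered as a SQL function too.
    def luhnCheck(s: String): Boolean = {
      val total = s.map(_.asDigit).reverse.zipWithIndex.map { case (d, i) =>
        // Index 0 is the check digit; every second digit from the right is doubled,
        // and a doubled value above 9 has 9 subtracted.
        if (i % 2 == 1) { val doubled = d * 2; if (doubled > 9) doubled - 9 else doubled }
        else d
      }.sum
      total % 10 == 0
    }

    spark.udf.register("is_valid_luhn", luhnCheck _)

    df.createOrReplaceTempView("numbers")
    spark.sql("SELECT value, is_valid_luhn(value) AS isValidLuhn FROM numbers").show()

    // Or keep only the valid rows with the DataFrame API:
    dfWithLuhnCheck.filter(col("isValidLuhn")).show()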