How to do Luhn check in df column in spark scala

The df has a string column such as "100256437". I want to add another column that checks whether the value passes the Luhn check: lit(true) if it passes, otherwise lit(false).

  def Mod10(c: Column): Column = {
    var (odd, sum) = (true, 0)
 
    for (int <- c.reverse.map { _.toString.toShort }) {
      println(int)
      if (odd) sum += int
      else sum += (int * 2 % 10) + (int / 5)
      odd = !odd
    }
    lit(sum % 10 === 0)
  }

Errors:

error: value reverse is not a member of org.apache.spark.sql.Column
    for (int <- c.reverse.map { _.toString.toShort }) {
                  ^
error: value === is not a member of Int
    lit(sum % 10 === 0)
                 ^

It looks like you are working with Spark DataFrames. A Column is a lazy expression that Spark evaluates, not a Scala collection, which is why reverse and map are not available on it; and since sum is a plain Int, the Column operator === cannot be applied to it either. One way to run the check per row is a UDF.

Suppose you have this DataFrame:

import spark.implicits._   // needed for .toDF() on a local collection (spark is the SparkSession)

val df = List("100256437", "79927398713").toDF()

df.show()
+-----------+
|      value|
+-----------+
|  100256437|
|79927398713|
+-----------+

Now you can implement the Luhn check as a UDF:

import org.apache.spark.sql.functions.{col, udf}

val isValidLuhn = udf { (s: String) =>
  val digits = s.toCharArray.map(_.asDigit)
  val len = digits.length

  // Walk the digits from the right: i = 1 is the check digit, i = 2 the one
  // before it, and so on. Every second digit (even i) is doubled; a doubled
  // value above 9 is reduced to its digit sum, i.e. doubled - 9.
  var i = 1
  while (i <= len) {
    if (i % 2 == 0) {
      val doubled = digits(len - i) * 2
      digits(len - i) = if (doubled > 9) doubled - 9 else doubled
    }
    i = i + 1
  }

  // Valid when the total is a multiple of 10.
  digits.sum % 10 == 0
}

which can be used as follows:

val dfWithLuhnCheck = df.withColumn("isValidLuhn", isValidLuhn(col("value")))

dfWithLuhnCheck.show()
+-----------+-----------+
|      value|isValidLuhn|
+-----------+-----------+
|  100256437|       true|
|79927398713|       true|
+-----------+-----------+
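
If you prefer to avoid a UDF, the check can also be expressed with Spark's built-in higher-order functions. This is only a sketch: it assumes Spark 2.4+ (for sequence, transform and aggregate), the column name value from the example above, and input strings that contain only digits.

import org.apache.spark.sql.functions.expr

// For each 1-based position i from the right, take the digit, double it when
// i is even, then sum everything, folding doubled values above 9 down by 9.
val luhnCheck = expr("""
  aggregate(
    transform(
      sequence(1, length(value)),
      i -> cast(substring(reverse(value), i, 1) AS INT) * if(i % 2 = 0, 2, 1)
    ),
    0,
    (acc, d) -> acc + if(d > 9, d - 9, d)
  ) % 10 = 0
""")

df.withColumn("isValidLuhn", luhnCheck).show()

This keeps the whole computation inside Catalyst expressions, which Spark can optimize, whereas a UDF is opaque to the optimizer.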