Spark SQL function for any numeric type

I can't figure out how to write a single function that works for any numeric type; right now I have one version per type:

import scala.collection.mutable.WrappedArray

def array_add_Int(x: WrappedArray[Int], y: WrappedArray[Int]): WrappedArray[Int] = {
  require(x.length == y.length, "ERROR: cannot operate on arrays of different lengths.")
  x.zipAll(y, 0, 0).map(pair => pair._1 + pair._2)
}

def array_add_Long(x: WrappedArray[Long], y: WrappedArray[Long]): WrappedArray[Long] = {
  require(x.length == y.length, "ERROR: cannot operate on arrays of different lengths.")
  // Long defaults (0L) keep the zipped pairs typed as (Long, Long),
  // avoiding the asInstanceOf[Number].longValue() workaround.
  x.zipAll(y, 0L, 0L).map(pair => pair._1 + pair._2)
}

I tried to merge the two into a single generic function, like this:

def array_add[T](itemX: Traversable[T], itemY: Traversable[T])(implicit n: Numeric[T]) = {
  require(itemX.size == itemY.size, "ERROR: cannot operate on arrays of different lengths.")
  // Problem: the Int literals 0 force zipAll's element type to widen
  // beyond T, and the implicit Numeric instance n is never used, so
  // the addition only resolves when T happens to be Int.
  itemX.toSeq.zipAll(itemY.toSeq, 0, 0).map(pair => pair._1 + pair._2)
}

It only works for Int. Any ideas?

You need to actually use the Numeric type class instance that you require as an implicit parameter:

def array_add[T](itemX: Traversable[T], itemY: Traversable[T])(implicit n: Numeric[T]) = {
  require(itemX.size == itemY.size, "ERROR: cannot operate on arrays of different lengths.")
  // n.zero is the additive identity for T and n.plus is T's addition,
  // so nothing here depends on a concrete numeric type.
  itemX.toSeq.zipAll(itemY.toSeq, n.zero, n.zero).map(pair => n.plus(pair._1, pair._2))
}
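For example (a minimal sketch; the spark session and the UDF names are illustrative assumptions, not part of the answer), the same definition now covers Int, Long, and Double, and concrete instantiations can be registered as Spark SQL UDFs, since Scala UDFs receive array columns as Seq and must be monomorphic:

// One definition, any Numeric element type:
array_add(Seq(1, 2, 3), Seq(10, 20, 30))    // Seq(11, 22, 33)  (Int)
array_add(Seq(1L, 2L), Seq(3L, 4L))         // Seq(4L, 6L)      (Long)
array_add(Seq(1.5, 2.5), Seq(0.5, 0.5))     // Seq(2.0, 3.0)    (Double)

// Register one concrete instantiation per element type you need:
spark.udf.register("array_add_long", (x: Seq[Long], y: Seq[Long]) => array_add(x, y))
spark.udf.register("array_add_double", (x: Seq[Double], y: Seq[Double]) => array_add(x, y))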

Note that you can use n.fromInt() instead of n.zero if you need a padding value other than 0.
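As a sketch of that variant (the pad parameter and function name are hypothetical, not from the answer), n.fromInt converts an Int into the right T; and since zipAll pads the shorter side, the equal-length check can be dropped here:

def array_add_padded[T](itemX: Traversable[T], itemY: Traversable[T], pad: Int)(implicit n: Numeric[T]) = {
  // e.g. n.fromInt(1) yields 1L for Long and 1.0 for Double
  val p = n.fromInt(pad)
  itemX.toSeq.zipAll(itemY.toSeq, p, p).map(pair => n.plus(pair._1, pair._2))
}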