Convert Matrix to RowMatrix in Apache Spark using Scala

I would really like to convert my org.apache.spark.mllib.linalg.Matrix to an org.apache.spark.mllib.linalg.distributed.RowMatrix.

I can do it like this:

import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.linalg.distributed.RowMatrix

val xx = X.computeGramianMatrix()  // X is an existing RowMatrix; xx has type org.apache.spark.mllib.linalg.Matrix
// Round-trip through the string representation: print the matrix, collapse the
// space-padded columns into commas, then parse every entry back into a Double.
val xxs = xx.toString()
val xxr = xxs.split("\n").map(row => row.replace("   "," ").replace("  "," ").replace("  "," ").replace("  "," ").replace(" ",",").split(","))
val xxp = sc.parallelize(xxr)
val xxd = xxp.map(ar => Vectors.dense(ar.map(elm => elm.toDouble)))
val xxrm: RowMatrix = new RowMatrix(xxd)

But this is really ugly and a total hack. Can anyone show me a better way?

Note that I am using Spark version 1.3.0.

I suggest you convert the Matrix to an RDD[Vector] first; that can then be turned into a RowMatrix directly.

So, let's consider the following example:

import org.apache.spark.rdd._
import org.apache.spark.mllib.linalg._


val denseData = Seq(
  Vectors.dense(0.0, 1.0, 2.0),
  Vectors.dense(3.0, 4.0, 5.0),
  Vectors.dense(6.0, 7.0, 8.0),
  Vectors.dense(9.0, 0.0, 1.0)
)
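
As an aside, denseData is already a Seq[Vector], so it can be distributed and wrapped in a RowMatrix directly. A minimal sketch (the variable name denseMat is mine, and it assumes the RowMatrix import shown further down):

val denseMat = new RowMatrix(sc.parallelize(denseData))

The interesting case is a local Matrix such as dm: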

val dm: Matrix = Matrices.dense(3, 2, Array(1.0, 3.0, 5.0, 2.0, 4.0, 6.0))  // a 3x2 matrix; entries are column-major
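
Because Matrices.dense takes its entries in column-major order, dm holds the rows (1.0, 2.0), (3.0, 4.0) and (5.0, 6.0); printing it is a quick way to confirm the layout:

println(dm)
// 1.0  2.0
// 3.0  4.0
// 5.0  6.0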

We need to define a method that converts a Matrix into an RDD[Vector]:

def matrixToRDD(m: Matrix): RDD[Vector] = {
  // m.toArray is column-major, so first group the entries into columns...
  val columns = m.toArray.grouped(m.numRows)
  // ...then transpose to get the rows. Skip the transpose if you want a column-major RDD.
  val rows = columns.toSeq.transpose
  val vectors = rows.map(row => Vectors.dense(row.toArray))
  sc.parallelize(vectors)
}
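
To see what the method does, here is a trace of the intermediate values for dm (a sketch; the values follow from the column-major layout noted above, and the names cols and rowSeqs are mine):

val cols = dm.toArray.grouped(dm.numRows).toSeq
// cols contains Array(1.0, 3.0, 5.0) and Array(2.0, 4.0, 6.0) -- the two columns of dm
val rowSeqs = cols.transpose
// rowSeqs contains Seq(1.0, 2.0), Seq(3.0, 4.0) and Seq(5.0, 6.0) -- the three rows of dm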

Now we can apply that conversion to the main Matrix:

import org.apache.spark.mllib.linalg.distributed.RowMatrix
val rows = matrixToRDD(dm)
val mat = new RowMatrix(rows)
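
As a quick sanity check, the distributed matrix should have the same shape as dm:

println(mat.numRows()) // 3
println(mat.numCols()) // 2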

One small but important point in the code above: the row vectors are built with Vectors.dense rather than new DenseVector. Vectors.dense is typed as Vector, so the result matches the declared return type RDD[Vector]; new DenseVector would produce an RDD[DenseVector], which does not compile because RDD is invariant.
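
Finally, an aside for readers on newer releases: from Spark 2.0 on, Matrix exposes a rowIter iterator (not available in the 1.3.0 used here), which shrinks the whole conversion to one line. A sketch, assuming a 2.0+ version:

val matFromIter = new RowMatrix(sc.parallelize(dm.rowIter.toSeq))  // Spark 2.0+ only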