Scala: GraphX: error: class Array takes type parameters
I am trying to build an edge RDD for GraphX. I read a csv file into a DataFrame and then try to convert it to an Edge RDD:
val staticDataFrame = spark.
  read.
  option("header", true).
  option("inferSchema", true).
  csv("/projects/pdw/aiw_test/aiw/haris/Customers_DDSW-withDN$.csv")

val edgeRDD: RDD[Edge[(VertexId, VertexId, String)]] =
  staticDataFrame.select(
    "dealer_customer_number",
    "parent_dealer_cust_number",
    "dealer_code"
  ).map{ (row: Array) =>
    Edge((
      row.getAs[Long]("dealer_customer_number"),
      row.getAs[Long]("parent_dealer_cust_number"),
      row("dealer_code")
    ))
  }
But I get this error:
<console>:81: error: class Array takes type parameters
val edgeRDD: RDD[Edge[(VertexId, VertexId, String)]] = staticDataFrame.select("dealer_customer_number", "parent_dealer_cust_number", "dealer_code").map((row: Array) => Edge((row.getAs[Long]("dealer_customer_number"), row.getAs[Long]("parent_dealer_cust_number"), row("dealer_code"))))
^
The result of

staticDataFrame.select("dealer_customer_number", "parent_dealer_cust_number", "dealer_code").take(1)

is
res3: Array[org.apache.spark.sql.Row] = Array([0000101,null,B110])
First, Array takes type parameters, so you would have to write Array[Something]. But that is probably not what you want anyway.

A DataFrame is a Dataset[Row], not a Dataset[Array[_]], so you have to change

.map{ (row: Array) =>

to

.map{ (row: Row) =>

or omit the type annotation entirely (it should be inferred):

.map{ row =>
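Putting that fix together, a corrected version of the snippet might look like the sketch below. This is untested against your data and makes a few assumptions: the two id columns were inferred as Long (your sample output `Array([0000101,null,B110])` hints they may actually be strings with leading zeros, and the parent id can be null, which getAs[Long] would not handle), the Dataset is converted to an RDD before mapping so no Encoder for Edge is needed, and GraphX's Edge(srcId, dstId, attr) constructor is used with the dealer code as the single edge attribute, so the RDD element type is Edge[String] rather than Edge[(VertexId, VertexId, String)]:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.Row
import org.apache.spark.graphx.{Edge, VertexId}

// Assumes "dealer_customer_number" and "parent_dealer_cust_number" were
// inferred as Long and are non-null; adjust the getAs calls (or add null
// handling) if inferSchema chose String for these columns.
val edgeRDD: RDD[Edge[String]] =
  staticDataFrame.select(
    "dealer_customer_number",
    "parent_dealer_cust_number",
    "dealer_code"
  ).rdd.map { (row: Row) =>
    Edge(
      row.getAs[Long]("dealer_customer_number"),    // srcId
      row.getAs[Long]("parent_dealer_cust_number"), // dstId
      row.getAs[String]("dealer_code")              // edge attribute
    )
  }
```

Calling .rdd before .map sidesteps Spark SQL's Encoder machinery entirely, which is usually the simplest route when the target is a GraphX RDD anyway.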