RDD output in spark-shell differs from println(rdd.collect()) in IDEA
val rddData1 = sc.makeRDD(1 to 10, 2)
println(rddData1.glom.collect)

This println prints something like [[I@34a0ef00, both in IDEA and in spark-shell.
But evaluating rddData1.glom.collect directly at the spark-shell prompt prints

Array[Array[Int]] = Array(Array(1, 2, 3, 4, 5), Array(6, 7, 8, 9, 10))

How can I get the same readable output in IDEA?
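The difference has nothing to do with Spark itself: the spark-shell REPL pretty-prints the value of every evaluated expression, while println calls the JVM's Object.toString on the array, which yields only the type tag and identity hash ([[I@... means int[][]). A minimal sketch in plain Scala, no Spark involved, shows the same behavior:

// Plain Scala array, same toString behavior as the RDD case
val nested = Array(Array(1, 2), Array(3, 4))
println(nested)  // prints something like [[I@1b6d3586
println(java.util.Arrays.deepToString(nested.map(_.asInstanceOf[Object])))  // [[1, 2], [3, 4]]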
Use java.util.Arrays.deepToString to render the nested array:

import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("myAppName")
val sc = new SparkContext(conf)

val rddData1 = sc.makeRDD(1 to 10, 2)

// Array.toString only prints the reference, e.g. [[I@34a0ef00
println(rddData1.glom().collect())

// deepToString recursively renders nested arrays
println(java.util.Arrays.deepToString(rddData1.glom().collect().map(_.asInstanceOf[Object])))
// output: [[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]]

// stop the SparkContext when done
sc.stop()
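If you would rather avoid java.util.Arrays, mkString produces the same shape with the plain Scala collections API (a sketch, equivalent to the deepToString call above):

// render each partition, then the outer array
println(rddData1.glom().collect()
  .map(_.mkString("[", ", ", "]"))
  .mkString("[", ", ", "]"))
// [[1, 2, 3, 4, 5], [6, 7, 8, 9, 10]]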
For a flat (non-nested) array, java.util.Arrays.toString is enough:

val conf = new SparkConf()
  .setMaster("local[*]")
  .setAppName("myAppName")
val sc = new SparkContext(conf)

val rddData1 = sc.makeRDD(1 to 10, 2)

println(java.util.Arrays.toString(rddData1.collect()))
// output: [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]

// stop the SparkContext when done
sc.stop()
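Two Scala-only alternatives for the flat case, offered as a sketch rather than part of the original answer: mkString for a one-line rendering, or foreach(println) for one element per line:

println(rddData1.collect().mkString("[", ", ", "]"))  // [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
rddData1.collect().foreach(println)                   // one element per line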