H2O implicit conversion throws compilation error
The code below throws an error when assigning an H2OFrame; most likely the implicit conversion is the problem. The error is:
type mismatch;
 found   : org.apache.spark.h2o.RDD[Int] (which expands to) org.apache.spark.rdd.RDD[Int]
 required: org.apache.spark.h2o.H2OFrame (which expands to) water.fvec.H2OFrame
And the code:
import org.apache.spark.h2o._
import org.apache.spark._
import org.apache.spark.SparkContext._

object App1 extends App {
  val conf = new SparkConf()
  conf.setAppName("Test")
  conf.setMaster("local[1]")
  conf.set("spark.executor.memory", "1g")
  val sc = new SparkContext(conf)
  // note: backslashes must be escaped in a Scala string literal
  val rawData = sc.textFile("c:\\spark\\data.csv")
  val data = rawData.map(line => line.split(',').map(_.toDouble))
  val response: RDD[Int] = data.map(row => row(0).toInt)
  val h2oResponse: H2OFrame = response // <-- this line throws the error
  sc.stop()
}
What you are missing are the implicits from the h2oContext:

// assumes an H2OContext has already been created, e.g. via H2OContext.getOrCreate(sc)
import h2oContext.implicits._
val h2oResponse: H2OFrame = response.toDF()
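The underlying mechanism is plain Scala: the compiler only applies a user-defined conversion between two types when an implicit conversion is in scope, which is exactly what `import h2oContext.implicits._` brings in. A minimal, self-contained sketch of the same pattern, using hypothetical `Frame`/`Conversions` names rather than the Sparkling Water API:

```scala
import scala.language.implicitConversions

// hypothetical target type, standing in for water.fvec.H2OFrame
case class Frame(values: Seq[Double])

object Conversions {
  // analogous to the conversions provided by h2oContext.implicits._
  implicit def seqToFrame(xs: Seq[Int]): Frame = Frame(xs.map(_.toDouble))
}

object Demo extends App {
  // Without this import, `val f: Frame = Seq(1, 2, 3)` fails with the
  // same kind of "type mismatch" error as in the question.
  import Conversions._
  val f: Frame = Seq(1, 2, 3) // compiles via the implicit seqToFrame
  println(f.values.sum)       // prints 6.0
}
```

This is why the fix is a single import: the assignment itself is unchanged, but the compiler can now find a conversion from the right-hand type to `H2OFrame`.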