Why does importing SparkSession in spark-shell fail with "object SparkSession is not a member of package org.apache.spark.sql"?
I am using Spark 1.6.0 on my Cloudera VM.
I am trying to insert some data into a Hive table from the Spark shell.
To do that, I am trying to use SparkSession, but the import below does not work.
scala> import org.apache.spark.sql.SparkSession
<console>:33: error: object SparkSession is not a member of package org.apache.spark.sql
import org.apache.spark.sql.SparkSession
Without it, I cannot execute this statement:
val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
<console>:33: error: not found: value SparkSession
val spark = SparkSession.builder.master("local[2]").enableHiveSupport().config("hive.exec.dynamic.partition","true").config("hive.exec.dynamic.partition.mode", "nonstrict").config("spark.sql.warehouse.dir", warehouseLocation).config("hive.metastore.warehouse.dir","/user/hive/warehouse").getOrCreate()
Can anyone tell me what mistake I am making here?
SparkSession is available as of Spark 2.0, so you should be using SQLContext instead (or upgrade your Spark to the latest and greatest 2.1.1).
Quoting Starting Point: SQLContext from the Spark 1.6.0 documentation:
The entry point into all functionality in Spark SQL is the SQLContext class, or one of its descendants.
In addition to the basic SQLContext, you can also create a HiveContext, which provides a superset of the functionality provided by the basic SQLContext.
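In Spark 1.6 the same Hive settings and table write can be expressed through a HiveContext. A minimal sketch, assuming a Hive-enabled spark-shell where sc is predefined; demo_table and the sample rows are hypothetical placeholders:

import org.apache.spark.sql.hive.HiveContext

// In spark-shell 1.6 the pre-created sqlContext is usually already a
// HiveContext; creating one explicitly from the SparkContext also works.
val hiveContext = new HiveContext(sc)

// The same Hive settings that were passed to the SparkSession builder.
hiveContext.setConf("hive.exec.dynamic.partition", "true")
hiveContext.setConf("hive.exec.dynamic.partition.mode", "nonstrict")

// Hypothetical sample data; demo_table is a placeholder table name.
val df = hiveContext.createDataFrame(Seq((1, "a"), (2, "b"))).toDF("id", "value")
df.write.mode("append").saveAsTable("demo_table")

Because HiveContext talks to the Hive metastore, saveAsTable here plays the role that enableHiveSupport() plays for SparkSession in Spark 2.x.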