object hbase is not a member of package org.apache.spark.sql.execution.datasources
I am trying to get data from HBase using the Spark-Hbase-Connector:
import org.apache.spark.sql.execution.datasources.hbase._
The error is:
object hbase is not a member of package org.apache.spark.sql.execution.datasources
The .jar for org.apache.hbase.hbase-spark already exists in my local .m2 repository.
...and I really want to know where this package lives (the object I want to use from it is HBaseTableCatalog).
The relevant section of pom.xml is:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-spark</artifactId>
    <version>3.0.0-SNAPSHOT</version>
</dependency>
The following is clearly stated on the shc site:
Users can use the Spark-on-HBase connector as a standard Spark package. To include the package in your Spark application use:
Note: com.hortonworks:shc-core:1.1.1-2.1-s_2.11 has not been uploaded to spark-packages.org, but will be there soon.
spark-shell, pyspark, or spark-submit
$SPARK_HOME/bin/spark-shell --packages com.hortonworks:shc-core:1.1.1-2.1-s_2.11
Users can include the package as the dependency in your SBT file as well. The format is the spark-package-name:version in build.sbt file.
libraryDependencies += "com.hortonworks/shc-core:1.1.1-2.1-s_2.11"
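If you are on plain sbt rather than the sbt-spark-package plugin, an equivalent sketch would be the following (the Hortonworks repository URL is an assumption; check where the artifact is actually hosted):

resolvers += "Hortonworks Repository" at "http://repo.hortonworks.com/content/groups/public/"
libraryDependencies += "com.hortonworks" % "shc-core" % "1.1.1-2.1-s_2.11"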
So if you are using Maven, you will have to download the jar and include it in your project manually for testing.
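For example, assuming you have downloaded shc-core-1.1.1-2.1-s_2.11.jar (the file path below is a placeholder), you can install it into your local .m2 repository with mvn install:install-file and then reference it like any other dependency:

mvn install:install-file \
  -Dfile=/path/to/shc-core-1.1.1-2.1-s_2.11.jar \
  -DgroupId=com.hortonworks \
  -DartifactId=shc-core \
  -Dversion=1.1.1-2.1-s_2.11 \
  -Dpackaging=jar

<dependency>
    <groupId>com.hortonworks</groupId>
    <artifactId>shc-core</artifactId>
    <version>1.1.1-2.1-s_2.11</version>
</dependency>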
Or you can try the shc that has been uploaded to Maven.
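Once shc-core is on the classpath, the import should resolve and HBaseTableCatalog can be used to map an HBase table onto a DataFrame. A minimal read sketch, where the table, column family, and column names are all hypothetical:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.execution.datasources.hbase._

val spark = SparkSession.builder().appName("shc-demo").getOrCreate()

// Hypothetical catalog: maps the row key and one column of HBase table
// "mytable" to DataFrame fields "id" and "value".
val catalog =
  """{
    |"table":{"namespace":"default", "name":"mytable"},
    |"rowkey":"key",
    |"columns":{
    |  "id":{"cf":"rowkey", "col":"key", "type":"string"},
    |  "value":{"cf":"cf1", "col":"col1", "type":"string"}
    |}
    |}""".stripMargin

val df = spark.read
  .options(Map(HBaseTableCatalog.tableCatalog -> catalog))
  .format("org.apache.spark.sql.execution.datasources.hbase")
  .load()

df.show()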