Getting com.sap.spark.vora.VoraConfigurationException with "discovery" parameter
I have installed an HDP 2.3.4 cluster on 3 machines running SLES 11 SP3, with Vora 1.2 on top.
I finally got the discovery service working; I can verify it at http://myclustermachine:8500/ui/#/dc1/services. Also, the Vora Thriftserver no longer dies.
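In case it helps, besides the UI I also checked the service catalog from the command line. This assumes the discovery service is stock Consul underneath, which the default port 8500 and the /ui/#/dc1/services path suggest:

curl http://myclustermachine:8500/v1/catalog/services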
So I can get past the line "val vc = new SapSQLContext(sc)" on page 34 of the Vora installation guide. But when I try to create a table, I get the following:
com.sap.spark.vora.VoraConfigurationException: Following parameter(s) are invalid: discovery
at com.sap.spark.vora.config.ParametersValidator$.checkSyntax(ParametersValidator.scala:280)
at com.sap.spark.vora.config.ParametersValidator$.apply(ParametersValidator.scala:98)
at com.sap.spark.vora.DefaultSource.createRelation(DefaultSource.scala:108)
at org.apache.spark.sql.execution.datasources.CreateTableUsingTemporaryAwareCommand.resolveDataSource(CreateTableUsingTemporaryAwareCommand.scala:59)
at org.apache.spark.sql.execution.datasources.CreateTableUsingTemporaryAwareCommand.run(CreateTableUsingTemporaryAwareCommand.scala:29)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:69)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute.apply(SparkPlan.scala:140)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute.apply(SparkPlan.scala:138)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:147)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:138)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:933)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:933)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:144)
at org.apache.spark.sql.DataFrame.<init>(DataFrame.scala:129)
at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)
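For reference, the statement I ran is roughly the following (a minimal sketch; the table name, columns, and file path are placeholders, and the option names are my reading of the Vora 1.2 documentation rather than my exact call):

import org.apache.spark.sql.SapSQLContext

val vc = new SapSQLContext(sc) // sc is the SparkContext already provided by the shell

// Register a Vora table over a CSV file in HDFS (placeholder schema and path)
vc.sql("""
  CREATE TABLE testTable (a1 double, a2 int, a3 string)
  USING com.sap.spark.vora
  OPTIONS (
    tableName "testTable",
    paths "/user/vora/test.csv"
  )
""")

Judging from the stack trace, the exception is raised by com.sap.spark.vora.config.ParametersValidator inside DefaultSource.createRelation, i.e. the parameters are rejected before the relation is ever built.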
What could be wrong this time?
Apparently it was the line I had added to spark-defaults.conf for the discovery parameter: "spark.vora.discovery xxxxxxx:8500"
Once I removed it, everything worked.
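Concretely, this is the spark-defaults.conf entry that triggered the validation error (host placeholder as above); deleting it or commenting it out is what fixed it for me:

# spark-defaults.conf
# spark.vora.discovery xxxxxxx:8500   <- this line made ParametersValidator reject "discovery"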