Sqoop create job is complaining about the arguments
When I run the sqoop import below, it works fine.
sqoop import -libjars ${JARS} --driver ${DRIVER}
--connect ${URL} -m 1 --hive-overwrite --hive-import
--hive-database ${Database} --hive-table Table
--target-dir '/tmp/Table' --as-parquetfile
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND $CONDITIONS"
When I try to create a sqoop job for the same import, it complains about an error while parsing the arguments.
Creating the sqoop job:
sqoop job --create SomeJobName -- import -libjars ${JARS}
--driver ${DRIVER} --connect ${URL} -m 1
--hive-overwrite --hive-import
--hive-database ${Database}
--hive-table Table --target-dir '/tmp/Table' --as-parquetfile
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND $CONDITIONS"
This is the error I get:
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
18/02/15 10:55:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.12.1
18/02/15 10:55:57 ERROR tool.BaseSqoopTool: Error parsing arguments for import:
18/02/15 10:55:57 ERROR tool.BaseSqoopTool: Unrecognized argument: -libjars
I was able to resolve the issue by passing -libjars as a generic argument instead, after looking at the syntax in the sqoop docs:
sqoop job (generic-args) (job-args) [-- [subtool-name] (subtool-args)]
sqoop job -libjars /var/lib/sqoop/some.jar,/var/lib/sqoop/some.jar --create SomeJobName -- import
--driver ${DRIVER} --connect ${URL} -m 1
--hive-overwrite --hive-import
--hive-database ${Database}
--hive-table Table --target-dir '/tmp/Table' --as-parquetfile
--query "select cl1, c12, c13 from sourceSchema.sourceTable WHERE 1=1 AND $CONDITIONS"