Oozie 4.0.0 build error - could not resolve hcatalog dependencies
Hadoop version - 1.2.1
Maven version - 3.0.5
Hive version - 0.14.0
Pig version - 0.14.0
When I start building Oozie with the following command:
./mkdistro.sh -DskipTests
I get the following error:
[INFO] Apache Oozie Share Lib Sqoop ...................... SKIPPED
[INFO] Apache Oozie Share Lib Streaming .................. SKIPPED
[INFO] Apache Oozie Share Lib Distcp ..................... SKIPPED
[INFO] Apache Oozie WebApp ............................... SKIPPED
[INFO] Apache Oozie Examples ............................. SKIPPED
[INFO] Apache Oozie Share Lib ............................ SKIPPED
[INFO] Apache Oozie Tools ................................ SKIPPED
[INFO] Apache Oozie MiniOozie ............................ SKIPPED
[INFO] Apache Oozie Distro ............................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:13.847s
[INFO] Finished at: Sun Aug 09 13:22:12 IST 2015
[INFO] Final Memory: 38M/273M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal on project oozie-hcatalog: Could not resolve dependencies for project org.apache.oozie:oozie-hcatalog:jar:0.5.0.oozie-4.0.0: Failed to collect dependencies for [org.apache.hcatalog:hcatalog-server-extensions:jar:0.5.0-incubating (compile), org.apache.hcatalog:hcatalog-core:jar:0.5.0-incubating (compile), org.apache.hcatalog:webhcat-java-client:jar:0.5.0-incubating (compile), org.apache.hive:hive-common:jar:0.14.0 (compile), org.apache.hive:hive-metastore:jar:0.14.0 (compile), org.apache.hive:hive-exec:jar:0.14.0 (compile), org.apache.hive:hive-serde:jar:0.14.0 (compile), org.apache.thrift:libfb303:jar:0.7.0 (compile), org.codehaus.jackson:jackson-core-asl:jar:1.8.8 (compile), org.codehaus.jackson:jackson-mapper-asl:jar:1.8.8 (compile)]: Failed to read artifact descriptor for org.apache.hive:hive-builtins:jar:0.14.0: Could not transfer artifact org.apache.hive:hive-builtins:pom:0.14.0 from/to Codehaus repository (http://repository.codehaus.org/): repository.codehaus.org: Name or service not known: Unknown host repository.codehaus.org: Name or service not known -> [Help 1]
The error seems to be about resolving the HCatalog dependencies. But since I am using Hive 0.14, where hcatalog is built inside hive, is there a way to exclude the HCatalog dependency? Or does the error point to something else, and if so, how do I fix it?
"hcatalog is built inside hive"
Not exactly: HCatalog lets any application (Pig, Spark, Sqoop, etc.) access the Hive Metastore; it usually ships bundled with the Hive installation kit, but it can be pulled out and used without the rest of the Hive libraries.
In fact, Oozie bundles two distinct ShareLibs: one for Hive (the default for Hive actions) and one for HCatalog (an additional, optional ShareLib for Pig/Spark/Sqoop actions that need access to Hive tables).
I solved this by disabling the dependency on repository.codehaus.org, since that repository has been unavailable since May. The other dependencies are fetched from the Maven repository.
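For reference, one way to disable that repository is to remove (or comment out) the entry pointing at http://repository.codehaus.org/ in the <repositories> section of Oozie's top-level pom.xml. The following is only a minimal sketch: the <id> and <name> values are assumptions taken from the error message above, so locate the entry by its <url> in your checkout rather than by name.

<!-- In the <repositories> section of Oozie's top-level pom.xml,
     comment out (or delete) the repository that points at the dead
     Codehaus host. The id/name shown here are assumptions; match the
     entry by its <url>. -->
<!--
<repository>
    <id>Codehaus repository</id>
    <name>Codehaus repository</name>
    <url>http://repository.codehaus.org/</url>
</repository>
-->

An alternative that avoids editing the Oozie sources is to add a <mirror> entry in ~/.m2/settings.xml whose mirrorOf matches that repository id and whose URL points at Maven Central (https://repo1.maven.org/maven2/). Either way, Maven stops trying to contact the unresolvable host and the remaining artifacts are fetched from the Maven repository, as described above.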