Apache Spark installation on Windows 7 32-bit

I have just started learning Apache Spark. The first thing I did was try to install Spark on my machine. I downloaded Spark 1.5.2 pre-built for Hadoop 2.6. When I run spark-shell I get the following error:

java.lang.RuntimeException: java.lang.NullPointerException
        at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
        at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:171)
        at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:163)
        at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:161)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:168)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
        at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
        at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
    at java.lang.reflect.Constructor.newInstance(Unknown Source)
    at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
    at $iwC$$iwC.<init>(<console>:9)
    at $iwC.<init>(<console>:18)
    at <init>(<console>:20)
    at .<init>(<console>:24)
    at .<clinit>(<console>)
    at .<init>(<console>:7)
    at .<clinit>(<console>)
    at $print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
    at java.lang.reflect.Method.invoke(Unknown Source)
    at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
    at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1340)
    at org.apache.spark.repl.SparkIMain.loadAndRunReq(SparkIMain.scala:840)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
    at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
    at org.apache.spark.repl.SparkILoop.reallyInterpret(SparkILoop.scala:857)
    at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
    at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark.apply(SparkILoopInit.scala:132)
    at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark.apply(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
    at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
    at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
    at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$$anonfun$apply$mcZ$sp.apply$mcV$sp(SparkILoop.scala:974)

I searched for this error and found that I have to download winutils.exe, which I did. I set HADOOP_HOME = "c:\Hadoop" and then ran the command

C:\Hadoop\bin\winutils.exe chmod 777 /tmp/hive

But I got the following error:

This version of C:\Hadoop\bin\winutils.exe is not compatible with the version of
Windows you're running. Check your computer's system information to see whether
you need a x86 (32-bit) or x64 (64-bit) version of the program, and then contact
the software publisher.

I tried to search for a 32-bit version of winutils.exe, but I could not find one. Please help me with this installation. Thanks in advance.

The following links may help.

https://issues.apache.org/jira/browse/HADOOP-9922

https://issues.apache.org/jira/browse/HADOOP-11784

Not able to find winutils.exe for hadoop 2.6.0 for 32 bit windows
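As a side note, the error message you got means the winutils.exe you downloaded is a 64-bit build. Before trying another download, you can verify which architecture any .exe was built for by reading the machine field of its PE/COFF header. Here is a minimal sketch in Python (the constants are the standard IMAGE_FILE_MACHINE values from the PE specification; the winutils.exe path is just an example):

```python
import struct

def pe_machine(path):
    """Return the PE 'machine' field of a Windows executable."""
    with open(path, "rb") as f:
        data = f.read(1024)
    # Offset 0x3C of the DOS header stores the offset of the "PE\0\0" signature.
    pe_offset = struct.unpack_from("<I", data, 0x3C)[0]
    if data[pe_offset:pe_offset + 4] != b"PE\x00\x00":
        raise ValueError("not a PE executable")
    # The 2-byte machine field immediately follows the 4-byte signature.
    return struct.unpack_from("<H", data, pe_offset + 4)[0]

IMAGE_FILE_MACHINE_I386 = 0x014C   # x86 (32-bit)
IMAGE_FILE_MACHINE_AMD64 = 0x8664  # x64 (64-bit)

# Example: pe_machine(r"C:\Hadoop\bin\winutils.exe") == IMAGE_FILE_MACHINE_AMD64
# would confirm the binary is 64-bit and cannot run on 32-bit Windows 7.
```

This way, if you do find a winutils.exe that claims to be 32-bit, you can check it returns 0x014C before setting up HADOOP_HOME again.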