When I try and run pyspark.cmd I get the error message "find: 'version': No such file or directory"
I am trying to get started with Apache Spark. I want to use it through Python. However, when I run pyspark from the command line, I get the following error message:
C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin>pyspark.cmd
Running python with PYTHONPATH=C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python\lib\py4j-0.8.2.1-src.zip;C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python;
Python 2.7.8 |Anaconda 2.1.0 (32-bit)| (default, Jul 2 2014, 15:13:35) [MSC v.1500 32 bit (Intel)] on win32
Type "help", "copyright", "credits" or "license" for more information.
Anaconda is brought to you by Continuum Analytics.
Please check out: http://continuum.io/thanks and https://binstar.org
find: 'version': No such file or directory
else was unexpected at this time.
Traceback (most recent call last):
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\bin\..\python\pyspark\shell.py", line 45, in <module>
    sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\context.py", line 102, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway)
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\context.py", line 211, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway()
  File "C:\Programs\Apache\Spark\spark-1.2.0-bin-hadoop2.4\python\pyspark\java_gateway.py", line 73, in launch_gateway
    raise Exception(error_msg)
Exception: Launching GatewayServer failed with exit code 255!
Warning: Expected GatewayServer to output a port, but found no output.
When I try to run the Scala interface by running spark-shell, I get the messages:
find: 'version': No such file or directory
else was unexpected at this time.
The only information I could find about this error online is https://issues.apache.org/jira/browse/SPARK-3808, which turned out to be a dead end.

Please help!
I had the same problem with Spark 1.2.0, but not with Spark 1.0.2.

In my case, the cause was that I had Cygwin in the DOS PATH. Spark uses the find command in the file 'spark-class2.cmd', and it was picking up the Cygwin find command instead of the DOS find command, which works somewhat differently: the Windows find searches its input for a literal string, while the Cygwin (GNU) find treats the argument as a directory to search, which is why it complains "find: 'version': No such file or directory".

I removed Cygwin from the DOS PATH, and that solved the problem.
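To confirm this diagnosis, you can list every find binary on the PATH in the order cmd resolves them. A minimal sketch for the Windows command prompt; the Cygwin location C:\cygwin\bin is just an example, yours may differ:

REM List every 'find' on the PATH, in resolution order.
REM If something like C:\cygwin\bin\find.exe is listed before
REM C:\Windows\System32\find.exe, spark-class2.cmd will call the wrong one.
where find

REM Session-only workaround: put System32 first so the Windows
REM find.exe wins, then retry the shell.
set PATH=C:\Windows\System32;%PATH%
pyspark.cmd

Removing the Cygwin directory from the PATH permanently (System Properties > Environment Variables), as described above, is the lasting fix.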
Regards, Felix