No module named 'pyspark' in Zeppelin
I'm new to Spark and just getting started with it. I try to import SparkSession from pyspark, but it raises the error "No module named 'pyspark'". Please see my code below.

```
# Import SparkSession so we can use it
from pyspark.sql import SparkSession

# Create the SparkSession; this can take a couple of minutes locally
spark = SparkSession.builder.appName("basics").getOrCreate()
```
Error:

```
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-2-6ce0f5f13dc0> in <module>
      1 # Import our SparkSession so we can use it
----> 2 from pyspark.sql import SparkSession
      3 # Create our SparkSession, this can take a couple minutes locally
      4 spark = SparkSession.builder.appName("basics").getOrCreate()

ModuleNotFoundError: No module named 'pyspark'
```
I am in my conda env, and I tried `pip install pyspark`, but it is already installed.
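One way to narrow this down (a general diagnostic sketch, not specific to Zeppelin) is to check, from inside the failing paragraph, which Python interpreter is actually running and whether it can see pyspark. Installing with `pip` into a conda env does not help if the notebook is wired to a different interpreter:

```python
import sys
import importlib.util

# Which Python binary is executing this cell? If this is not the
# conda env you installed pyspark into, the import will fail.
print("interpreter:", sys.executable)

# Is pyspark importable from this interpreter's site-packages?
spec = importlib.util.find_spec("pyspark")
print("pyspark visible:", spec is not None)
if spec is not None:
    print("located at:", spec.origin)
```

If `pyspark visible: False` is printed here, the notebook's interpreter simply is not the environment you installed into, and the fix is to point the notebook at the right Python.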
If you are using Zepl, they have their own specific way of importing. This makes sense: they need their own syntax because they run in the cloud, and the interpreter directive distinguishes their syntax from plain Python. For example, start the paragraph with `%spark.pyspark`:

```
%spark.pyspark
from pyspark.sql import SparkSession
```
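Putting it together, the question's snippet as a single notebook paragraph would look roughly like this (a sketch, assuming the Spark interpreter group is named `spark`, which is the default in Zeppelin):

```
%spark.pyspark
from pyspark.sql import SparkSession

# getOrCreate() reuses an existing session, so this is safe to run
# even in notebooks that predefine a `spark` variable for you.
spark = SparkSession.builder.appName("basics").getOrCreate()
print(spark.version)
```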