How to install Apache Toree for Spark Kernel in Jupyter in (ana)conda environment?
I am trying to install Jupyter support for Spark in a conda environment (which I set up following http://conda.pydata.org/docs/test-drive.html) of the Anaconda distribution.
I am trying to use Apache Toree as the Jupyter kernel for this.
Here is what I did after installing Anaconda:
conda create --name jupyter python=3
source activate jupyter
conda install jupyter
pip install --pre toree
jupyter toree install
Everything works until I reach the last line, where I get
PermissionError: [Errno 13] Permission denied: '/usr/local/share/jupyter'
Which raises the question: why is it even looking in that directory? It should stay within the environment, after all. So I run
jupyter --paths
and get
config:
/home/user/.jupyter
~/anaconda2/envs/jupyter/etc/jupyter
/usr/local/etc/jupyter
/etc/jupyter
data:
/home/user/.local/share/jupyter
~/anaconda2/envs/jupyter/share/jupyter
/usr/local/share/jupyter
/usr/share/jupyter
runtime:
/run/user/1000/jupyter
I am not quite sure what is going on here, or how to get everything running in (and, if possible, only in) the conda environment "jupyter".
By default, Jupyter tries to install kernels into the system-wide kernel registry, which is why it is looking under /usr/local/share/jupyter. You can pass the --user flag, which will use the per-user kernel directory instead. More details are available in kernelspec.py.
The following command installs the Toree kernel into the per-user kernel registry:
jupyter toree install --user
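On Linux, the per-user kernel registry corresponds to the first data entry shown by jupyter --paths above (~/.local/share/jupyter). A minimal stdlib sketch of where --user puts kernel specs, assuming Jupyter's default path conventions and no JUPYTER_DATA_DIR override:

```python
import os

# Default per-user Jupyter data directory on Linux; Jupyter honors
# the JUPYTER_DATA_DIR environment variable if it is set.
user_data_dir = os.environ.get(
    "JUPYTER_DATA_DIR",
    os.path.expanduser("~/.local/share/jupyter"),
)

# Kernel specs installed with --user land in the kernels/ subdirectory,
# e.g. ~/.local/share/jupyter/kernels/apache_toree_scala
user_kernels_dir = os.path.join(user_data_dir, "kernels")
print(user_kernels_dir)
```

Because this directory is under your home directory, no root permissions are needed, which avoids the PermissionError above.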
You can view all available options with --help:
$ jupyter toree install --help
A Jupyter kernel for talking to spark
Options
-------
Arguments that take values are actually convenience aliases to full
Configurables, whose aliases are listed on the help line. For more information
on full configurables, see '--help-all'.
--user
Install to the per-user kernel registry
--replace
Replace any existing kernel spec with this name.
--sys-prefix
Install to Python's sys.prefix. Useful in conda/virtual environments.
--debug
set log level to logging.DEBUG (maximize logging output)
--kernel_name= (ToreeInstall.kernel_name)
Default: 'Apache Toree'
Install the kernel spec with this name. This is also used as the base of the
display name in jupyter.
--spark_home= (ToreeInstall.spark_home)
Default: '/usr/local/spark'
Specify where the spark files can be found.
--toree_opts= (ToreeInstall.toree_opts)
Default: ''
Specify command line arguments for Apache Toree.
--spark_opts= (ToreeInstall.spark_opts)
Default: ''
Specify command line arguments to proxy for spark config.
--interpreters= (ToreeInstall.interpreters)
Default: 'Scala'
A comma separated list of the interpreters to install. The names of the
interpreters are case sensitive.
--python_exec= (ToreeInstall.python_exec)
Default: 'python'
Specify the python executable. Defaults to "python"
--log-level= (Application.log_level)
Default: 30
Choices: (0, 10, 20, 30, 40, 50, 'DEBUG', 'INFO', 'WARN', 'ERROR', 'CRITICAL')
Set the log level by value or name.
--config= (JupyterApp.config_file)
Default: ''
Full path of a config file.
To see all available configurables, use `--help-all`
Examples
--------
jupyter toree install
jupyter toree install --spark_home=/spark/home/dir
jupyter toree install --spark_opts='--master=local[4]'
jupyter toree install --kernel_name=toree_special
jupyter toree install --toree_opts='--nosparkcontext'
jupyter toree install --interpreters=PySpark,SQL
jupyter toree install --python=python
Using jupyter toree install --sys-prefix
is the best option for conda and venv environments.
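With the environment activated, sys.prefix points inside the env, which is why --sys-prefix sidesteps the PermissionError entirely: the kernel spec goes into the second data path from jupyter --paths (~/anaconda2/envs/jupyter/share/jupyter) rather than /usr/local. A quick sketch, assuming the standard share/jupyter/kernels layout:

```python
import os
import sys

# In an activated conda env this is e.g. ~/anaconda2/envs/jupyter,
# not /usr or /usr/local, so no root permissions are needed.
print(sys.prefix)

# `jupyter toree install --sys-prefix` writes the kernel spec under
# this directory, keeping the kernel inside the environment.
kernels_dir = os.path.join(sys.prefix, "share", "jupyter", "kernels")
print(kernels_dir)
```

Since the kernel spec lives inside the environment, it disappears along with the environment when you remove it, which is usually what you want.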