I want to install pyspark on my home machine. I did

pip install pyspark
pip install jupyter

Both seemed to work fine. But when I try to run pyspark:

pyspark

I get the following error:

Could not find valid SPARK_HOME while searching ['/home/user', '/home/user/.local/bin']

What should SPARK_HOME be set to?
Posted on 2018-03-31 20:08:49
I just ran into the same problem, but it turned out that pip install pyspark downloads a Spark distribution that works fine in local mode. Pip simply doesn't set SPARK_HOME properly. But once I set it manually, pyspark worked like a charm (without downloading any additional packages).
$ pip3 install --user pyspark
Collecting pyspark
Downloading pyspark-2.3.0.tar.gz (211.9MB)
100% |████████████████████████████████| 211.9MB 9.4kB/s
Collecting py4j==0.10.6 (from pyspark)
Downloading py4j-0.10.6-py2.py3-none-any.whl (189kB)
100% |████████████████████████████████| 194kB 3.9MB/s
Building wheels for collected packages: pyspark
Running setup.py bdist_wheel for pyspark ... done
Stored in directory: /home/mario/.cache/pip/wheels/4f/39/ba/b4cb0280c568ed31b63dcfa0c6275f2ffe225eeff95ba198d6
Successfully built pyspark
Installing collected packages: py4j, pyspark
Successfully installed py4j-0.10.6 pyspark-2.3.0
$ PYSPARK_PYTHON=python3 SPARK_HOME=~/.local/lib/python3.5/site-packages/pyspark pyspark
Python 3.5.2 (default, Nov 23 2017, 16:37:01)
[GCC 5.4.0 20160609] on linux
Type "help", "copyright", "credits" or "license" for more information.
2018-03-31 14:02:39 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /__ / .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
Using Python version 3.5.2 (default, Nov 23 2017 16:37:01)
>>>
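If you are unsure what the right path is on your machine, you can ask the installed package itself. This is a minimal sketch, assuming only that pyspark is importable by the same interpreter you installed it with; the printed directory is the value to export as SPARK_HOME:

import os
import pyspark

# SPARK_HOME must point at the directory holding the Spark distribution
# (bin/, jars/, etc.); for a pip install, the pyspark package directory
# is exactly that.
spark_home = os.path.dirname(pyspark.__file__)
print(spark_home)  # e.g. ~/.local/lib/python3.5/site-packages/pyspark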
Posted on 2019-10-17 07:51:47
To install Spark, first make sure you have Java 8 or higher installed. Then go to the Spark Downloads page, select the latest Spark release, pre-built for Hadoop, and download it. Unzip the archive and move it to /opt (or any folder, but remember where you moved it):

mv spark-2.4.4-bin-hadoop2.7 /opt/spark-2.4.4

Then create a symbolic link. This way you can download and keep multiple Spark versions side by side, switching between them by re-pointing the link:

ln -s /opt/spark-2.4.4 /opt/spark

Add the following to your .bash_profile to tell your shell where to find Spark:

export SPARK_HOME=/opt/spark
export PATH=$SPARK_HOME/bin:$PATH

Finally, to make Spark use python3, add the following to the /opt/spark/conf/spark-env.sh file:

export PYSPARK_PYTHON=/usr/local/bin/python3
export PYSPARK_DRIVER_PYTHON=python3

https://stackoverflow.com/questions/46286436
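Once the exports above are in place (open a new shell or source ~/.bash_profile first), a short smoke test can confirm the setup. This is a sketch assuming only the configuration above; "smoke-test" is just an illustrative application name:

import os
from pyspark.sql import SparkSession

# Should print /opt/spark if .bash_profile was picked up.
print(os.environ.get("SPARK_HOME"))

# Start a local Spark session and run a trivial job.
spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()
print(spark.range(5).count())  # prints 5 when Spark is working
spark.stop()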