I have downloaded spark-2.1.0-bin-without-hadoop, and it is located in the following directory:
~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop
When I cd into that directory, then into bin, and try to run pyspark, I get the following error:
/usr/local/bin/pyspark: line 24: ~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/load-spark-env.sh: No such file or directory
/Users/ahajibagheri/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/spark-class: line 24: ~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/bin/load-spark-env.sh: No such file or directory
Failed to find Spark jars directory (~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop/assembly/target/scala-/jars).
You need to build Spark with the target "package" before running this program.
I have already set my JAVA_HOME and SPARK_HOME:
echo $JAVA_HOME
/Library/Java/JavaVirtualMachines/jdk1.8.0_131.jdk/Contents/Home
echo $SPARK_HOME
~/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop
I am running everything on macOS Sierra 10.12.6. Any help with this issue would be greatly appreciated. If I have missed something, please let me know so I can update the question accordingly.
Thanks
I ran into the same problem. To fix it, I had to define SPARK_HOME without the home-directory shortcut (~). I think in your case it should look like this:
export SPARK_HOME="/Users/ahajibagheri/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"
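If you want this setting to survive new terminal sessions, you can also append the export to your shell startup file and reload it. A minimal sketch, assuming the default bash shell on macOS Sierra and a ~/.bash_profile startup file (adjust the file name if your setup differs):

# Persist SPARK_HOME using the full path (no ~ shortcut); ~/.bash_profile is assumed here
echo 'export SPARK_HOME="/Users/ahajibagheri/Desktop/ahajib/opt/spark-2.1.0-bin-without-hadoop"' >> ~/.bash_profile
source ~/.bash_profile

# Then launch pyspark through the expanded path
"$SPARK_HOME/bin/pyspark"

With SPARK_HOME expanded to an absolute path, load-spark-env.sh should resolve correctly and the "Failed to find Spark jars directory" error should go away.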