Running Spark on Linux : $JAVA_HOME not set error
You need to edit the file named 'spark-config.sh' in Spark's 'sbin' directory. Add your JAVA_HOME export to this file, and the start scripts will then find Java.
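A minimal sketch of the fix. The JDK path is the asker's (adjust to your install), and for demonstration purposes the snippet targets a local stand-in file; on a real install the target would be "$SPARK_HOME/sbin/spark-config.sh":

```shell
# Append a JAVA_HOME export to Spark's sbin/spark-config.sh so the
# worker-launch scripts can find Java.
# Demo uses a local stand-in file; in practice use "$SPARK_HOME/sbin/spark-config.sh".
SPARK_CONFIG="./spark-config.sh"
touch "$SPARK_CONFIG"                                    # stand-in for the real file
echo 'export JAVA_HOME=/home/marc/jdk1.8.0_101' >> "$SPARK_CONFIG"
grep 'JAVA_HOME' "$SPARK_CONFIG"                         # confirm the line was added
```

After the edit, re-run sbin/start-all.sh; the export is read by the script itself, so it works even in non-interactive shells.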
Marc Zaharescu
Working as a full-stack Software Engineer for a leading retailer in London. Personal website: http://www.marczaharescu.com
Updated on June 12, 2022

Comments:
- Marc Zaharescu, almost 2 years ago:
I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set

    export JAVA_HOME=/home/marc/jdk1.8.0_101
    export SCALA_HOME=/home/marc/scala-2.11.8
    export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
    export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin

at the end of .bashrc, and also included these lines in the start-all.sh file in the spark/sbin folder. When I type echo $JAVA_HOME, it prints the correct path, /home/marc/jdk1.8.0_101.

But when I run sbin/start-all.sh, it fails with the following error:

    localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set

I tried to follow similar topics, but I couldn't find a solution to the problem. Any help would be much appreciated.
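A likely reason the .bashrc exports didn't help, assuming a standard setup: start-all.sh launches the worker through a non-interactive shell, and non-interactive shells either skip ~/.bashrc entirely or (as with Ubuntu's default ~/.bashrc) return near the top before reaching exports added at the end. A small demonstration of how an exported variable is inherited by a child shell but invisible to one started with a clean environment (the variable name here is purely illustrative):

```shell
# A variable exported in the current shell is inherited by child processes:
export DEMO_JAVA_HOME=/home/marc/jdk1.8.0_101
bash -c 'echo "child shell sees: $DEMO_JAVA_HOME"'
# But a process started with a scrubbed environment (env -i), similar to a
# launcher that never reads your .bashrc exports, sees nothing:
env -i bash -c 'echo "clean environment sees: [$DEMO_JAVA_HOME]"'
```

This is why putting the export into a file the launch scripts themselves source (such as sbin/spark-config.sh or conf/spark-env.sh) fixes the error while .bashrc does not.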