Running Spark on Linux : $JAVA_HOME not set error


You need to modify the file named 'spark-config.sh' in the 'sbin' directory: add your JAVA_HOME export there, and everything will work.
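For example, a minimal sketch of the edit (the JDK path here is the one from the question below; adjust it to your own install):

```shell
# Append to $SPARK_HOME/sbin/spark-config.sh so every launch script,
# including the workers started by start-all.sh, sees the JDK location.
export JAVA_HOME=/home/marc/jdk1.8.0_101   # adjust to your JDK path
```

The launch scripts under sbin source spark-config.sh themselves, so a value set there is visible even when your shell startup files are not read.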

Author:
Marc Zaharescu

Working as a full-stack Software Engineer for leading retailer in London. Personal website: http://www.marczaharescu.com

Updated on June 12, 2022

Comments

  • Marc Zaharescu, almost 2 years ago

    I am trying to configure spark-2.0.0-bin-hadoop2.7 on Ubuntu 16.04.1 LTS. I have set

    export JAVA_HOME=/home/marc/jdk1.8.0_101
    export SCALA_HOME=/home/marc/scala-2.11.8
    export SPARK_HOME=/home/marc/spark-2.0.0-bin-hadoop2.7
    export PATH=$PATH:$SCALA_HOME/bin:$JAVA_HOME/bin
    

    at the end of .bashrc, and also included them in the start-all.sh file in the spark/sbin folder.

    When I type echo $JAVA_HOME, it prints the correct path: /home/marc/jdk1.8.0_101

    But when I call sbin/start-all.sh, it fails with the following error:

    localhost: failed to launch org.apache.spark.deploy.worker.Worker: localhost: JAVA_HOME is not set

    I tried to follow similar topics, but I couldn't find a solution to the problem. Any help would be much appreciated.
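    A likely reason for the mismatch (my reading, not confirmed in the thread): the workers are launched in a non-interactive shell, and Ubuntu's default .bashrc returns early for non-interactive shells, so exports added at its end never execute there. A quick way to see the difference between the two shell modes:

    ```shell
    # Interactive bash (-i) sources ~/.bashrc; plain non-interactive bash does not.
    # If JAVA_HOME is only exported at the end of ~/.bashrc, the second command
    # will typically print "unset" while the first prints the path.
    bash -ic 'echo "interactive:     ${JAVA_HOME:-unset}"'
    bash -c  'echo "non-interactive: ${JAVA_HOME:-unset}"'
    ```

    This is why putting the export in a file the Spark scripts themselves source (such as sbin/spark-config.sh, per the accepted answer) works when the .bashrc export does not.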