Unable to launch Spark

I'm not sure whether Spark works with Java 7, but if not, the obvious solution is to install Java 8:

sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer

Answering "yes" in the correct spots should get you Java 8 as default, otherwise

sudo update-java-alternatives -s java-8-oracle

will do the trick.
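
As a quick sanity check, assuming the installer put Java under /usr/lib/jvm/java-8-oracle (the usual location for this package), something like:

java -version                 # should report version 1.8.x
readlink -f "$(which java)"   # should resolve somewhere under /usr/lib/jvm/java-8-oracle
echo "$JAVA_HOME"             # should point to the same JVM, if you export it yourself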

Update: That said, if you want to run with OpenJDK 7, you need to find out where JAVA_HOME is being set wrongly, as in

gsamaras@gsamaras:/usr/lib/jvm$ $JAVA_HOME
bash: /usr/lib/jvm/java-8-oracle: No such file or directory

Since you're trying to correct that in .profile (did you hash -r or re-login?), you might want to check load-spark-env.sh or other scripts that are executed before Spark proper is launched.
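
A rough sketch of the hunt, assuming Spark was unpacked to ~/spark-1.6.0-bin-hadoop2.6 (adjust the path to your install):

# shell start-up files that might export JAVA_HOME
grep -n JAVA_HOME ~/.profile ~/.bashrc 2>/dev/null
# anything Spark sources before launching, e.g. conf/spark-env.sh or bin/load-spark-env.sh
grep -rn JAVA_HOME ~/spark-1.6.0-bin-hadoop2.6/conf ~/spark-1.6.0-bin-hadoop2.6/bin 2>/dev/null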

Cheers,


Comments

  • gsamaras over 1 year

    I followed this answer. I am getting this error:

    spark-class: line 86: /usr/lib/jvm/java-8-oracle/bin/java: No such file or directory

    And to my surprise, I have:

    gsamaras@gsamaras:/usr/lib/jvm$ ls
    java-1.7.0-openjdk-amd64  java-7-openjdk-amd64
    gsamaras@gsamaras:/usr/lib/jvm$ $JAVA_HOME
    bash: /usr/lib/jvm/java-8-oracle: No such file or directory
    

    How do I correct it?

    More information (from here):

    gsamaras@gsamaras:~$ which java
    /usr/bin/java
    gsamaras@gsamaras:~$ ls -alh /usr/bin/java
    lrwxrwxrwx 1 root root 22 Feb 10 00:54 /usr/bin/java -> /etc/alternatives/java
    gsamaras@gsamaras:~$ ls -alh /etc/alternatives/java
    lrwxrwxrwx 1 root root 46 Feb 10 00:54 /etc/alternatives/java -> /usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
    

    In the ~/.profile I had appended:

    export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
    export PATH=${JAVA_HOME}/bin:${PATH}
    export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
    

    from my Hadoop experience. When I sourced that, I was able to launch Spark.


    EDIT:

    gsamaras@gsamaras:~/spark-1.6.0-bin-hadoop2.6/conf$ ls
    docker.properties.template  metrics.properties.template   spark-env.sh.template
    fairscheduler.xml.template  slaves.template
    
  • gsamaras about 8 years
    It seems it does work with Java 7, so I'd rather not install Java 8, since it's not in the repositories.
  • Anders R. Bystrup about 8 years
    I interpret your edit as meaning you're now able to run Spark? If so, the problem really is where JAVA_HOME is set wrongly on your box. See my update and go looking :-) grep is your friend.
  • gsamaras about 8 years
    I did that (looked in the Spark directory, in conf/, for such a script) but found nothing. I am updating with the scripts I see. I'm not sure how to use grep here.
  • gsamaras about 8 years