How to use two versions of the Spark shell?
Solution 1

Set SPARK_MAJOR_VERSION to the major version you want, 2 or 1:
$ export SPARK_MAJOR_VERSION=2
$ spark-submit --version
SPARK_MAJOR_VERSION is set to 2, using Spark2
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0.2.5.0.0-1245
      /_/
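To keep that choice across sessions, you can persist the export in your shell profile (a minimal sketch, assuming a bash shell; use the profile file your shell actually reads):

# Persist the default Spark major version (assumes bash; ~/.bashrc is an assumption)
$ echo 'export SPARK_MAJOR_VERSION=2' >> ~/.bashrc
$ source ~/.bashrc
$ spark-submit --version   # should now report "using Spark2"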
Solution 2

This approach works: typing

spark-shell

loads Spark 1.6, while typing

spark2-shell

loads Spark 2.0.
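You can confirm which version a launcher started from inside the REPL with sc.version (the same check used in the question below; the exact string depends on your HDP build):

$ spark2-shell
scala> sc.version
res0: String = 2.0.0.2.5.0.0-1245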
Solution 3

Set the variable for a single invocation only, without changing your environment:

$ SPARK_MAJOR_VERSION=2 spark-shell
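The same one-off prefix works with the other launchers too; the variable applies only to that single command, and subsequent commands fall back to the default (a sketch using spark-submit):

# Spark 2 for this command only; later commands still pick the default version
$ SPARK_MAJOR_VERSION=2 spark-submit --version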
Solution 4

Use the versioned launchers directly: spark2-submit, pyspark2, or spark2-shell.
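For example, to submit an application explicitly through Spark 2 (a minimal sketch; the examples jar path is a hypothetical HDP-style location, so check your cluster):

# Run the stock SparkPi example via the Spark 2 launcher
# (jar path below is an assumption; verify under /usr/hdp/current/spark2-client)
$ spark2-submit \
    --class org.apache.spark.examples.SparkPi \
    --master yarn \
    /usr/hdp/current/spark2-client/examples/jars/spark-examples.jar 100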
Author: Ani Menon
Updated on November 10, 2021

Comments
-
Ani Menon over 2 years
I have Spark 1.6.2 and Spark 2.0 installed on my Hortonworks cluster.
Both versions are installed on a node in a 5-node Hadoop cluster.
Each time I start
spark-shell
I get:

$ spark-shell
Multiple versions of Spark are installed but SPARK_MAJOR_VERSION is not set
Spark1 will be picked by default

When I check the version I get:

scala> sc.version
res0: String = 1.6.2

How can I start the other version (the spark-shell of Spark 2.0)?
-
Ani Menon over 7 years
This is the same as the previous answer.
-
Anshul Sao over 7 years
It's mentioned as 2.0.0 in the other answer; you just need to set the major version:

$ export SPARK_MAJOR_VERSION=2
$ spark-submit --version
SPARK_MAJOR_VERSION is set to 2, using Spark2