spark <console>:12: error: not found: value sc

Solution 1

In my case, Spark is installed on a local Windows system, and I observed the same error, but it turned out to be caused by the issue below.

Issue: Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable.

This was a permission issue. I resolved it by changing the permissions with the command below. Although the log says "on HDFS", this path is on the Windows system:

E:\winutils\bin\winutils.exe chmod 777 E:\tmp\hive
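
To verify that the change took effect, winutils can also list the permissions (same example path as above):

E:\winutils\bin\winutils.exe ls E:\tmp\hive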

Solution 2

This happens when your classpath is not correct; as the transcript below shows, changing the classpath from inside the REPL makes sc disappear. This is an open issue in Spark at the moment.

> spark-shell 

...
...
14/08/08 18:41:50 INFO SparkILoop: Created spark context..
Spark context available as sc.

scala> sc
res0: org.apache.spark.SparkContext = org.apache.spark.SparkContext@2c1c5c2e

scala> :cp /tmp
Added '/tmp'.  Your new classpath is:
...

scala> sc
<console>:8: error: not found: value sc

You may need to correct your classpath from outside the REPL, for example by passing --jars when launching spark-shell.
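
Alternatively, a workaround (my sketch, not part of the original answer; SparkContext.getOrCreate requires Spark 1.4+) is to rebind sc from inside the REPL, since the running context itself usually survives the reset:

import org.apache.spark.{SparkConf, SparkContext}

// getOrCreate returns the already-running context if there is one and only
// builds a new one otherwise; "repl-recovery" is a placeholder app name and
// local[*] assumes a local run.
val conf = new SparkConf().setAppName("repl-recovery").setMaster("local[*]")
val sc = SparkContext.getOrCreate(conf)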

Solution 3

First, check the log output after running the spark-shell command to see whether the SparkContext was initialized as sc. If the SparkContext was not initialized properly, you have to set the IP address in the Spark environment.

Open the environment file conf/spark-env.sh and add the line below:

export SPARK_LOCAL_IP="127.0.0.1"
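
After adding the line, restart spark-shell. A quick smoke test (my own check, not part of the original answer) confirms that sc is bound again:

scala> sc.parallelize(1 to 100).count()
res0: Long = 100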

Solution 4

You get this error because sc is not defined. I would try:

from pyspark import SparkContext
sc = SparkContext(appName="foo")

Another thing that regularly happens to me is not having a Kerberos ticket on the cluster, because I forgot to obtain one.
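
If that is the cause, obtaining a ticket before launching the shell fixes it; the principal below is a placeholder:

kinit your_user@YOUR.REALM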


As for the "open issue in Spark" mentioned by Solnanki, I am pretty sure this is not the case any more.
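
In recent Spark versions (2.x and later) the shell also exposes a SparkSession as spark alongside sc. If you need to build the context yourself in standalone code, a minimal sketch (app name and master are placeholders, assuming a local run) looks like this:

import org.apache.spark.sql.SparkSession

// Build (or reuse) a session; "example" and local[*] are placeholder settings.
val spark = SparkSession.builder().appName("example").master("local[*]").getOrCreate()
val sc = spark.sparkContext

// The snippet from the question should then work:
val b = sc.parallelize(1 to 10000)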

Author: Amitesh Ranjan (https://amiteshranjan.com, https://www.instagram.com/_amitesh_/)

Updated on October 22, 2020

Comments

  • Amitesh Ranjan, over 3 years ago

    I wrote the following:

    val a = 1 to 10000
    val b = sc.parallelize(a)
    

    and it shows an error saying:

    <console>:12: error: not found: value sc
    

    Any help?

  • gsamaras, almost 7 years ago

    Comment by another user: "The code above yielded a warning about SparkContext not taking any parameters. So just try val sc = SparkContext; that fixed my issue."