Eclipse (set up with Scala environment): object apache is not a member of package org


Solution 1

Adding the spark-core jar to your classpath should resolve the issue. Also, if you are not already using a build tool such as Maven or Gradle, you should be: spark-core has a lot of transitive dependencies, and without a build tool you will keep hitting this problem for other jars. Use the Eclipse task these tools provide to set the project classpath correctly.
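
For illustration, here is a minimal build-definition sketch using sbt (an assumption; Maven or Gradle work equally well), with the Spark 1.4.1 / Scala 2.10 versions that the jar path in the comments suggests:

    // build.sbt - a minimal sketch (sbt assumed; versions taken from the
    // spark-core_2.10-1.4.1 jar mentioned in the comments)
    name := "spark-scala-app"   // hypothetical project name

    scalaVersion := "2.10.4"    // Spark 1.4.1 artifacts target Scala 2.10

    // %% appends the Scala binary version, resolving spark-core_2.10
    libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

If you keep working in Eclipse, the sbteclipse plugin can regenerate the project's .classpath from this build definition (sbt eclipse); Maven has a comparable eclipse:eclipse goal.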

Solution 2

I was also receiving the same error; in my case it was a compatibility issue. Spark 2.2.1 is not compatible with Scala 2.12 (it is built for Scala 2.11.8), while my IDE was set up for Scala 2.12.3. I resolved the error as follows:

1) Import the jar files from the Spark installation folder. The Spark installation directory (e.g. the Spark folder on the C drive) contains a jars folder with all the basic jar files. In Eclipse, right-click the project -> Properties -> Java Build Path. Under the Libraries tab, choose Add External JARs..., select all the jar files from the jars folder, and click Apply.

2) Again go to Properties -> Scala Compiler -> Scala Installation and select "Latest 2.11 bundle (dynamic)". Before selecting this option, check the compatibility of your Spark and Scala versions (see the build sketch below).
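
If the project is built with sbt rather than relying on Eclipse's bundled Scala installation, the same compatibility constraint can be expressed in the build definition. A minimal sketch, assuming the Spark 2.2.1 / Scala 2.11.8 pairing described above:

    // build.sbt - pin the Scala version that Spark 2.2.1 is built against
    scalaVersion := "2.11.8"

    // %% resolves the matching spark-core_2.11 artifact
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.1"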

Author: Shalini Baranwal

Updated on June 04, 2022

Comments

  • Shalini Baranwal, almost 2 years

    (screenshot: the error shown in the Eclipse editor)

    As shown in the image, Eclipse reports an error when I import the Spark packages. Please help. When I hover over it, it shows "object apache is not a member of package org". I searched for this error; it indicates the Spark jars have not been imported, so I imported "spark-assembly-1.4.1-hadoop2.2.0.jar" too, but I still get the same error. Below is what I actually want to run:

     import org.apache.spark.{SparkConf, SparkContext}

     object ABC {

       def main(args: Array[String]) {
         // Scala main method
         println("Spark Configuration")

         val conf = new SparkConf()
         conf.setAppName("My First Spark Scala Application")
         conf.setMaster("spark://ip-10-237-224-94:7077")

         println("Creating Spark Context")
       }
     }
    
  • Shalini Baranwal, about 8 years
    Is this the jar you are referring to: "/root/.m2/repository/org/apache/spark/spark-core_2.10/1.4.1/spark-core_2.10-1.4.1.jar"? Please help.
  • justAbit, about 8 years
    Yes, that's the jar.
  • user2458922, almost 5 years
    Seems like this answer is for some other issue.