NoSuchMethodError when using Spark and IntelliJ


Solution 1

Change your scalaVersion to 2.11.8 and add the Spark dependency to your build.sbt:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"
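
For reference, a minimal build.sbt that keeps the Scala version and the Spark artifact suffix in sync might look like the sketch below (the name and version are taken from the question; the key point is that a 2.11.x scalaVersion matches the _2.11 suffix of the artifact):

    name := "sparkBook"

    version := "1.0"

    // The Scala binary version must match the suffix of the Spark artifact below.
    scalaVersion := "2.11.8"

    // spark-core_2.11 is built against Scala 2.11, so it only works with a 2.11.x scalaVersion.
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"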

Solution 2

Another scenario: IntelliJ is pointing at Scala 2.12.4 while all of the Maven/sbt dependencies are built for 2.11.8, i.e. the Scala dependency version is 2.11.x.

I stepped back from 2.12.4 to 2.11.8 in IntelliJ's Global Libraries, and it started working.

Details:

The Maven pom.xml points to 2.11.8, but in my IntelliJ the Scala SDK under Global Libraries is 2.12.4 (screenshot below), which causes:

java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;

[Screenshot: Global Libraries listing Scala SDK 2.12.4] I stepped back to 2.11.8 in Global Libraries, as shown below.

[Screenshot: Global Libraries listing Scala SDK 2.11.8] That's it. Problem solved; the program now runs without the error.

Conclusion: fixing the Maven dependencies alone does not solve the problem; the Scala SDK in Global Libraries has to be configured as well, because the error occurs while running a local Spark program and comes from the IntelliJ runtime classpath.
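
If you want to double-check which Scala runtime IntelliJ actually puts on the classpath, a quick diagnostic (just a sketch, not part of the original answer) is to print the Scala version before creating the SparkContext:

    import org.apache.spark.{SparkConf, SparkContext}

    object VersionCheck {
      def main(args: Array[String]): Unit = {
        // Prints e.g. "version 2.11.8"; if this reports 2.12.x while the
        // dependencies are _2.11 artifacts, the NoSuchMethodError above is expected.
        println(scala.util.Properties.versionString)

        val conf = new SparkConf().setAppName("Version Check").setMaster("local[2]")
        val sc = new SparkContext(conf)
        println(sc.version) // the Spark version, e.g. "2.0.2"
        sc.stop()
      }
    }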

Author by

lars

Updated on June 19, 2022

Comments

  • lars
    lars almost 2 years

    I'm new to Scala and Spark. I've been frustrated by how hard it has been to get things to work with IntelliJ. Currently, I can't run the code below. I'm sure it's something simple, but I can't get it to work.

    I'm trying to run:

    import org.apache.spark.{SparkConf, SparkContext}
    
    object TestScala {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
        conf.setAppName("Datasets Test")
        conf.setMaster("local[2]")
        val sc = new SparkContext(conf)
        println(sc)
      }
    }
    

    The error I get is:

    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
    at org.apache.spark.util.Utils$.getCallSite(Utils.scala:1413)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:77)
    at TestScala$.main(TestScala.scala:13)
    at TestScala.main(TestScala.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:147)
    

    My build.sbt file:

    name := "sparkBook"
    
    version := "1.0"
    
    scalaVersion := "2.12.1"
    
  • Angelo Genovese
    Angelo Genovese over 7 years
    It's a bit more idiomatic to use libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2", isn't it?
  • The_Tourist
    The_Tourist over 7 years
    @AngeloGenovese You're right, it is more idiomatic, but I was hesitant to suggest that because if the Scala version is changed to 2.12, the build will fail when Spark hasn't been published for that version. At least from what I've seen, this situation isn't easy to debug if the developer isn't aware of the %% vs % semantics (illustrated in the sketch after these comments).
  • seb
    seb over 4 years
    Yeah, maybe, but then I ran into an even bigger problem with this weird error: github.com/sbt/zinc/issues/276, where the "compiler-bridge_2.11" package could not be compiled. Scala 2.11.0, 2.11.8, and 2.11.11 did not work, but 2.11.12 did!
  • user1735921
    user1735921 over 2 years
    How can I use Scala 2.12?
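
To illustrate the %% vs % distinction discussed in the comments above: with scalaVersion set to 2.11.8, the two dependency declarations below resolve to the same artifact (a sketch using the versions from this answer):

    // %% appends the project's Scala binary version (here 2.11) to the artifact name.
    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.2"

    // % uses the artifact name verbatim, so the Scala suffix must be written out by hand.
    libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.0.2"

The %% form fails at resolution time if Spark has not been published for the chosen Scala version, which is the trade-off mentioned in the comments.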