Spark Error: Could not initialize class org.apache.spark.rdd.RDDOperationScope


Solution 1

Maybe you are missing the following library:

    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
        <version>2.4.4</version>
    </dependency>
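
If you are on sbt rather than Maven (the question below uses an sbt console), the equivalent line in build.sbt would look roughly like this (a sketch, assuming the same version as above):

    // build.sbt -- sbt equivalent of the Maven dependency above
    libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"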

Solution 2

This error message is usually accompanied by a cause like: Cause: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.9.8

It means there are conflicting versions among the dependencies (obviously). In the Spark world, it is usually because some library we use has a dependency conflict with the one Spark ships.
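
As a quick runtime check (a sketch, not part of the original answer), you can ask the JVM which Jackson it actually loaded and which jar it came from:

    // Run in spark-shell or the sbt console. The version may print null if
    // the jar's manifest lacks an Implementation-Version entry.
    val mapperClass = classOf[com.fasterxml.jackson.databind.ObjectMapper]
    println(mapperClass.getPackage.getImplementationVersion)            // e.g. 2.9.8
    println(mapperClass.getProtectionDomain.getCodeSource.getLocation) // jar location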

Using Coursier's cs resolve can reveal what's happening (Gradle also has dependency debugging, via gradle dependencies):

    cs resolve org.apache.spark:spark-core_2.11:2.4.5 | grep jackson
    cs resolve com.thesamet.scalapb:scalapb-json4s_2.11:0.10.0 | grep jackson

Then either build an uber jar for our application, or exclude the conflict in the build (if possible). E.g. in build.gradle:

    testCompile 'com.thesamet.scalapb:scalapb-json4s_2.11:0.10.0', { exclude group: 'com.fasterxml.jackson.core' }
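
For sbt users, a roughly equivalent exclusion in build.sbt would be (a sketch, assuming the same artifact and version as the Gradle line above):

    // build.sbt -- pull in scalapb-json4s but drop its transitive Jackson
    // artifacts, so the Jackson version shipped with Spark wins
    libraryDependencies += ("com.thesamet.scalapb" %% "scalapb-json4s" % "0.10.0")
      .excludeAll(ExclusionRule(organization = "com.fasterxml.jackson.core"))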

Comments

  • zunior about 1 year

    I've created a Spark standalone cluster on my laptop, then I go into an sbt console on a Spark project and try to embed a Spark instance like so:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("foo").setMaster(/* Spark Master URL */)
    val sc = new SparkContext(conf)
    

    Up to there everything works fine, then I try

    sc.parallelize(Array(1,2,3))
    // and I get: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
    

    How do I fix this?

    • Justin Pihony about 8 years
      Are your Spark versions compatible?