spark submit java.lang.ClassNotFoundException


Solution 1

Apparently something was wrong with my project structure in general, because after I created a new project with sbt and Sublime I was able to use spark-submit. This is really weird, though, because I haven't changed anything about the default structure of an sbt project as provided in IntelliJ. This is the project structure that now works like a charm:

Macbook:sp user$ find .
.
./build.sbt
./project
./project/plugin.sbt
./src
./src/main
./src/main/scala
./src/main/scala/MySimpleApp.scala
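
For reference, a minimal MySimpleApp.scala that fits this layout might look like the sketch below; the actual contents of the file were not posted, so this is only an assumed example against the Spark 1.4 / Scala 2.10 API used in the question.

// src/main/scala/MySimpleApp.scala -- hypothetical contents; no package
// declaration, so spark-submit can be invoked with --class MySimpleApp
import org.apache.spark.{SparkConf, SparkContext}

object MySimpleApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MySimpleApp")
    val sc   = new SparkContext(conf)

    // trivial job just to prove the submit works
    val evens = sc.parallelize(1 to 1000).filter(_ % 2 == 0).count()
    println(s"Even numbers: $evens")

    sc.stop()
  }
}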

Thanks for your help!

Solution 2

You should not refer to your class by its directory path, but by its fully qualified name (package plus class name). Example:

/Users/_name_here/dev/spark/bin/spark-submit 
--master local[4]
--class com.example.MySimpleApp /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar

From what I see you do not have MySimpleApp in any package, so just "--class MySimpleApp" should work.
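
If the class did live in a package, the package declaration in the source file and the --class argument would have to match. A minimal sketch, assuming a com.example package purely for illustration:

// src/main/scala/com/example/MySimpleApp.scala -- hypothetical packaged variant
package com.example

import org.apache.spark.{SparkConf, SparkContext}

object MySimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("MySimpleApp"))
    println(sc.parallelize(1 to 10).count())
    sc.stop()
  }
}

With that layout, --class com.example.MySimpleApp (as in the command above) is what spark-submit expects; without the package line, plain --class MySimpleApp is correct.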

Author: Bart

Updated on July 25, 2022

Comments

  • Bart (almost 2 years ago)

    I'm trying to run my own Spark application, but when I use the spark-submit command I get this error:

    Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar --stacktrace
    java.lang.ClassNotFoundException:        /Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:340)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:633)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    

    I'm using the following command:

    /Users/_name_here/dev/spark/bin/spark-submit 
    --class "/Users/_name_here/dev/sp/mo/src/main/scala/MySimpleApp" 
    --master local[4] /Users/_name_here/dev/sp/target/scala-2.10/sp_2.10-0.1-SNAPSHOT.jar 
    

    My build.sbt looks like this:

    name := "mo"
    
    version := "1.0"
    
    scalaVersion := "2.10.4"
    
    
    libraryDependencies ++= Seq(
      "org.apache.spark"          % "spark-core_2.10"   %    "1.4.0",
      "org.postgresql"            % "postgresql"        %    "9.4-1201-jdbc41",
      "org.apache.spark"          % "spark-sql_2.10"    %    "1.4.0",
      "org.apache.spark"          % "spark-mllib_2.10"  %    "1.4.0",
      "org.tachyonproject"        % "tachyon-client"    %    "0.6.4",
      "org.postgresql"            % "postgresql"        %    "9.4-1201-jdbc41",
      "org.apache.spark"          % "spark-hive_2.10"   %    "1.4.0",
      "com.typesafe"              % "config"            %    "1.2.1"
    )
    
    resolvers += "Typesafe Repo" at "http://repo.typesafe.com/typesafe/releases/"
    

    My plugin.sbt:

    logLevel := Level.Warn
    
    resolvers += "Sonatype snapshots" at "https://oss.sonatype.org/content/repositories/snapshots/"
    
    addSbtPlugin("com.github.mpeltonen" % "sbt-idea" % "1.6.0")
    addSbtPlugin("com.eed3si9n" % "sbt-assembly"  %"0.11.2")
    

    I'm using the prebuilt package from spark.apache.org. I installed sbt through brew, as well as Scala. Running sbt package from the Spark root folder works fine and creates the jar, but using assembly doesn't work at all, maybe because it's missing in the prebuilt Spark folder (see the sbt-assembly sketch after the comments). I would appreciate any help, because I'm quite new to Spark. Oh, and by the way, Spark runs fine within IntelliJ.

  • Bart (over 8 years ago)
    Unfortunately, I'm still encountering the same error message.
  • TheMP (over 8 years ago)
    Could you paste MySimpleApp? What is its package?