IntelliJ IDEA 14: cannot resolve symbol spark


Solution 1

This worked for me:

name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
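
The same block can be written with "%%", which makes sbt append the project's binary Scala version to each artifact name and avoids this kind of suffix bookkeeping entirely (an equivalent sketch of the block above, with all three modules pinned to the same Spark release):

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-mllib" % "2.2.0"
)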

Solution 2

I use

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

in my build.sbt and it works for me.
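
A quick way to confirm the dependency actually resolved, independently of IntelliJ's indexing, is to import Spark from the sbt console (a minimal check; the REPL echoes the import back if it succeeds):

$ sbt console
scala> import org.apache.spark.SparkContext
import org.apache.spark.SparkContext

If the import fails with "object apache is not a member of package org", the problem is in build.sbt rather than in the IDE.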

Solution 3

I had a similar problem. It seems the reason was that the build.sbt file specified the wrong version of Scala.

If you run spark-shell, at some point it will print the Scala version used by Spark, e.g.

Using Scala version 2.11.8

Then I edited the line in the build.sbt file to point to that version and it worked.
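
For example, assuming spark-shell reported Scala 2.11.8 as above, the matching build.sbt lines would look like this (spark-core 2.0.0 is only an illustrative version; use the one your installation actually ships):

scalaVersion := "2.11.8"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.0.0"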

Solution 4

Currently, spark-cassandra-connector is compatible with Scala 2.10 and 2.11.

It worked for me when I updated the Scala version of my project as below:

ThisBuild / scalaVersion := "2.11.12"

and updated my dependency like this:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",

If you use "%%", sbt will add your project’s binary Scala version to the artifact name.
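
For example, with scalaVersion set to 2.11.12, the following two lines resolve to the same artifact (shown only to illustrate the expansion):

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0"
libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.0"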

From the sbt shell, run:

sbt> reload
sbt> compile
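
To double-check what sbt resolved after the reload, the same shell can print the declared dependencies and re-run resolution (both are standard sbt commands):

sbt> show libraryDependencies
sbt> update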

Solution 5

Your library dependency conflicts with the Scala version you're using; you need Scala 2.11 for it to work. The correct dependency would be:

scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"

Note that you need to change spark-parent to spark-core.

Author: Giselle Van Dongen

Updated on December 08, 2020

Comments

  • Giselle Van Dongen over 3 years

    I added a Spark dependency, which worked in my first project. But when I try to make a new project with Spark, sbt does not import the external jars of org.apache.spark, so IntelliJ IDEA gives the error that it "cannot resolve symbol". I already tried making a new project from scratch and using auto-import, but neither works. When I try to compile, I get the message "object apache is not a member of package org". My build.sbt looks like this:

    name := "hello"
    version := "1.0"
    scalaVersion := "2.11.7"
    libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"
    

    I have the impression that something might be wrong with my sbt settings, although they already worked once. Everything is the same except for the external libraries... I also tried to import the pom.xml file of my Spark dependency, but that doesn't work either. Thank you in advance!

  • LMeyer over 8 years
    Correct. Or he can just double the first % operator and remove _2.11 (well, except there's no 2.11 version for spark-parent on Central).
  • Shyam Gupta over 6 years
    I am using IntelliJ IDEA 2017.2 Community Edition and it's working for me.