Scala IntelliJ IDEA 14: cannot resolve symbol spark

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/32265343/


IntelliJ IDEA 14: cannot resolve symbol spark

Tags: scala, intellij-idea, apache-spark, sbt

Asked by Giselle Van Dongen

I added a Spark dependency that worked in my first project. But when I try to make a new project with Spark, sbt does not import the external jars of org.apache.spark. Therefore IntelliJ IDEA gives the error that it "cannot resolve symbol". I already tried making a new project from scratch and using auto-import, but neither works. When I try to compile, I get the message "object apache is not a member of package org". My build.sbt looks like this:

name := "hello"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-parent_2.10" % "1.4.1"

I have the impression that something might be wrong with my sbt settings, although it already worked once. Apart from the external libraries, everything is the same... I also tried importing the pom.xml file of my Spark dependency, but that doesn't work either. Thank you in advance!
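
A quick way to narrow this down is to check from the sbt shell whether the declared dependency is actually being resolved, independently of IntelliJ; the commands below are standard sbt tasks, shown only as a sketch of that check:

sbt> reload
sbt> update
sbt> show libraryDependencies

If update succeeds but the org.apache.spark classes are still missing, the dependency line itself is the likely culprit: spark-parent is Spark's parent pom rather than a code jar, which is what the answers below address.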

Answered by Yash P Shah

This worked for me:

name := "ProjectName"
version := "0.1"
scalaVersion := "2.11.11"

libraryDependencies ++= Seq(
  "org.apache.spark" % "spark-core_2.11" % "2.2.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.2.0",
  "org.apache.spark" % "spark-mllib_2.10" % "1.1.0"
)
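
Equivalently, the same build can be written with %% so that sbt derives the _2.11 suffix from scalaVersion automatically; this is just a sketch of the same dependencies in that form:

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.2.0",
  "org.apache.spark" %% "spark-sql" % "2.2.0",
  "org.apache.spark" %% "spark-mllib" % "2.2.0"
)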

Answered by Tobi

I use

scalaVersion := "2.11.7"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"

in my build.sbt and it works for me.

Answered by JARS

I had a similar problem. It seems the reason was that the build.sbt file was specifying the wrong version of Scala.

If you run spark-shell, it will at some point print the Scala version used by Spark, e.g.

Using Scala version 2.11.8

Then I edited the line in the build.sbt file to point to that version, and it worked.

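In other words, once spark-shell has told you which Scala version your Spark build uses, pin scalaVersion to it and let %% pick the matching artifact. A minimal sketch for the 2.11.8 case above (the Spark version 2.2.0 here is only an assumed example, not taken from the original answer):

scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"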

Answered by cell-in

Currently spark-cassandra-connector is compatible with Scala 2.10 and 2.11.

It worked for me when I updated the Scala version of my project as below:

ThisBuild / scalaVersion := "2.11.12"

and I updated my dependency like this:

libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.4.0",

If you use "%%", sbt will add your project's binary Scala version to the artifact name.

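For example, with the scalaVersion above (2.11.12, binary version 2.11), the %% line resolves the same artifact as writing the suffix out by hand; the line below is shown only to illustrate that expansion:

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.4.0"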

From the sbt shell, run:

sbt> reload
sbt> compile

Answered by Erik Schmiegelow

Your library dependency conflicts with the Scala version you're using; you need to use 2.11 for it to work. The correct dependency would be:

scalaVersion := "2.11.7"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.1"

Note that you need to change spark-parent to spark-core.

Answered by Shyam Gupta

name := "SparkLearning"

名称 := "SparkLearning"

version := "0.1"

版本:=“0.1”

scalaVersion := "2.12.3"

斯卡拉版本:=“2.12.3”

// additional libraries libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"

// 附加库 libraryDependencies += "org.apache.spark" % "spark-streaming_2.10" % "1.4.1"
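
As written, this snippet mixes a Scala 2.12 project with a _2.10 Spark artifact, which is exactly the version mismatch described in the answers above. A consistent variant keeps the suffix and scalaVersion in sync; the sketch below assumes Spark 2.4.0 (a release that publishes _2.12 artifacts) and uses %% so the suffix follows scalaVersion:

name := "SparkLearning"

version := "0.1"

scalaVersion := "2.12.3"

// additional libraries
libraryDependencies += "org.apache.spark" %% "spark-streaming" % "2.4.0"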