Scala: Error in running Spark in IntelliJ: "object apache is not a member of package org"

Disclaimer: this page is a translation of a popular StackOverflow question and answer, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/43134572/

Tags: scala, apache-spark, intellij-14

Asked by Learner

I am running a Spark program in IntelliJ and getting the error below: "object apache is not a member of package org".

I have used these import statements in the code:

import org.apache.spark.SparkContext  
import org.apache.spark.SparkContext._  
import org.apache.spark.SparkConf

The above import statements do not work at the sbt prompt either. The corresponding library appears to be missing, but I am not sure how to add it or where it should go.

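For reference, the same failure can be reproduced outside IntelliJ by running "sbt console" from the project root and trying the import at the REPL prompt (a minimal check, assuming a standard sbt project layout; the exact console line number in the error message will vary):

scala> import org.apache.spark.SparkContext
// fails with: error: object apache is not a member of package org
// (the import resolves once spark-core is declared in build.sbt, as in the accepted answer below)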

Accepted answer by Vidya

Make sure you have entries like this in SBT:

scalaVersion := "2.11.8"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0", 
  "org.apache.spark" %% "spark-sql" % "2.1.0" 
)

Then make sure IntelliJ knows about these libraries by either enabling "auto-import" or doing it manually by clicking the refresh-looking button on the SBT panel.

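Once the dependencies resolve, a minimal program such as the following should compile and run both from sbt ("sbt run") and inside IntelliJ. This is only a sketch to verify the setup; the object name and the local[*] master setting are illustrative, not part of the original answer:

import org.apache.spark.{SparkConf, SparkContext}

object SparkCheck {
  def main(args: Array[String]): Unit = {
    // "local[*]" runs Spark inside the current JVM, so no cluster is needed for this check
    val conf = new SparkConf().setAppName("SparkCheck").setMaster("local[*]")
    val sc = new SparkContext(conf)
    // a trivial job that confirms spark-core is on the classpath and working
    println(sc.parallelize(1 to 10).sum())
    sc.stop()
  }
}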