Disclaimer: this page is a translation of a popular StackOverflow question, published under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license, link back to the original question, and attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/28269836/
scalac compile yields "object apache is not a member of package org"
Asked by Jiang Xiang
My code is:
import org.apache.spark.SparkContext
It runs in interactive mode, but when I compile it with scalac I get the following error message:
object apache is not a member of package org
This seems to be a classpath problem, but I do not know exactly how to configure the path.
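For context, this kind of import usually lives in a small self-contained application along the lines of the sketch below (the object name, master URL, and computation are placeholder choices, not part of the original question). It only compiles once spark-core is on the compile classpath, which is exactly what the answers below set up:

// SimpleApp.scala -- a minimal, hypothetical self-contained Spark application
import org.apache.spark.{SparkConf, SparkContext}

object SimpleApp {
  def main(args: Array[String]): Unit = {
    // local[*] runs Spark in-process, handy for a quick compile-and-run check
    val conf = new SparkConf().setAppName("SimpleApp").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val count = sc.parallelize(1 to 100).count()
    println(s"Count: $count")
    sc.stop()
  }
}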
Accepted answer by tgpfeiffer
You need to specify the path of the libraries used when compiling your Scala code. This is usually not done manually, but with a build tool such as Maven or sbt. You can find a minimal sbt setup at http://spark.apache.org/docs/1.2.0/quick-start.html#self-contained-applications
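For reference, a minimal build.sbt in the spirit of the linked quick-start might look like the sketch below; the project name and version numbers are placeholders, so pick the Spark and Scala versions that match your installation:

// build.sbt -- a minimal sketch; adjust versions to your environment
name := "simple-spark-app"
version := "0.1"
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2"

With that in place, sbt compile (or sbt package) pulls spark-core from Maven Central and puts it on the compile classpath, so scalac can resolve org.apache.spark.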
Answered by WattsInABox
I had this issue because I had the wrong scope for my Spark dependency. This is wrong:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
<scope>test</scope> <!-- will not be available during compile phase -->
</dependency>
This will work and will not include Spark in your uberjar, which is what you will almost certainly want:
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
<scope>provided</scope>
</dependency>
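The provided scope is usually what you want here: the Spark classes are visible on the compile classpath, so scalac can resolve org.apache.spark, but they are not packaged into your jar, because spark-submit (or the cluster) supplies them at run time. With test scope they are only available when compiling and running tests, which is why the main sources fail to compile.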
Answered by B-Tron of the Autobots
One easy way (if you're using the Play Framework) is to look up the library dependency in the Maven Repository, choose the version, choose SBT, and then add it to the bottom of your project/build.sbt file, like so:
// https://mvnrepository.com/artifact/org.apache.spark/spark-core
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2"
Afterwards, you'll want to enter reload into the sbt console and then compile. This can feel a little foreign if you're coming from pip or js, but the Maven Repo is your friend.
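If, like the Maven answer above, you do not want Spark bundled into the artifact you build, one sbt equivalent (a sketch using the version from this answer) is to mark the dependency as provided:

// same dependency, but excluded from the packaged artifact and supplied at run time
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.3.2" % "provided"

Note that provided dependencies are not on the classpath of sbt run by default, so this suits jars you submit with spark-submit rather than applications you run directly from sbt.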
Answered by Rahul Goyal
I was facing this issue in an sbt interactive session.
Resolved it by simply executing reload in the session.
Hope this helps!
Answered by ramana mavuluri
I had the same issue when running a Scala word count program on the Spark framework. I was using Eclipse as the IDE and Maven as the build tool. I just changed the scope of the Spark dependency in the POM file from "test" to "provided", like below. It worked.
<scope>provided</scope>

