scala console error: object apache is not a member of package org
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license, keep the link to the original, and attribute it to the original authors (not me): StackOverFlow
Original question: http://stackoverflow.com/questions/29515947/
scala console error: object apache is not a member of package org
Asked by Timothée HENRY
I am trying the code proposed here: http://spark.apache.org/docs/1.2.1/mllib-ensembles.html#classification
using the Scala console (Scala version = Scala code runner version 2.10.4), and get the following error:
scala> import org.apache.spark.mllib.tree.RandomForest
<console>:8: error: object apache is not a member of package org
import org.apache.spark.mllib.tree.RandomForest
^
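(The import fails here because the standalone Scala REPL does not have the Spark jars on its classpath. The Spark distribution ships its own REPL, spark-shell, which starts with those jars already loaded; the install path below is an assumption:)

root@sd:~# /path/to/spark-1.2.1/bin/spark-shell

scala> import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.tree.RandomForest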
I then followed the advice from here and tried to build a simple self-contained application, but ran into a different problem:
root@sd:~/simple# sbt package
[info] Set current project to Simple Project (in build file:/root/simple/)
[info] Updating {file:/root/simple/}default-c5720e...
[info] Resolving org.scala-lang#scala-library;2.10.4 ...
[info] Resolving org.apache.spark#spark-core_2.10.4;1.2.0 ...
[warn] module not found: org.apache.spark#spark-core_2.10.4;1.2.0
[warn] ==== local: tried
[warn] /root/.ivy2/local/org.apache.spark/spark-core_2.10.4/1.2.0/ivys/ivy.xml
[warn] ==== public: tried
[warn] http://repo1.maven.org/maven2/org/apache/spark/spark-core_2.10.4/1.2.0/spark-core_2.10.4-1.2.0.pom
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: UNRESOLVED DEPENDENCIES ::
[warn] ::::::::::::::::::::::::::::::::::::::::::::::
[warn] :: org.apache.spark#spark-core_2.10.4;1.2.0: not found
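The unresolved dependency comes from the artifact name: Spark artifacts on Maven Central are suffixed with the Scala binary version (spark-core_2.10), not the full Scala version (spark-core_2.10.4), so spark-core_2.10.4 does not exist. In sbt you can either write the suffix by hand or let the %% operator append it for you; a minimal sketch of the relevant build.sbt line:

// %% appends the Scala binary version, so with scalaVersion := "2.10.4"
// this resolves to org.apache.spark#spark-core_2.10;1.2.0
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"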
Can anyone advise what I could try?
Answered by prabeesh
You can find detailed steps in this post on how to write a self-contained Spark application using SBT in Scala. In the sbt configuration file you should specify the dependent libraries.
libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.10" % "1.2.1",
"org.apache.spark" % "spark-mllib_2.10" % "1.2.1")
Then compile using the following command:
sbt package
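Once sbt package succeeds, the resulting jar can be run with spark-submit. The jar path below follows sbt's default target layout for the build.sbt sketch above, and the class name SimpleApp is only a placeholder for your own main class:

$SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master local[2] \
  target/scala-2.10/simple-project_2.10-1.0.jar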

