Running spark scala example fails

Note: this content comes from a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you use or share it, you must attribute it to the original authors on StackOverflow (not me).
Original question: http://stackoverflow.com/questions/26351338/
Asked by user2003470
I'm new to both Spark and Scala. I've created an IntelliJ Scala project with SBT and added a few lines to build.sbt.
name := "test-one"
version := "1.0"
scalaVersion := "2.11.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
My version of Scala is 2.10.4, but this problem also occurs with 2.11.2:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class
at akka.util.Collections$EmptyImmutableSeq$.<init>(Collections.scala:15)
at akka.util.Collections$EmptyImmutableSeq$.<clinit>(Collections.scala)
at akka.japi.Util$.immutableSeq(JavaAPI.scala:209)
at akka.actor.ActorSystem$Settings.<init>(ActorSystem.scala:150)
at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:470)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
at org.apache.spark.util.AkkaUtils$.org$apache$spark$util$AkkaUtils$$doCreateActorSystem(AkkaUtils.scala:121)
at org.apache.spark.util.AkkaUtils$$anonfun.apply(AkkaUtils.scala:54)
at org.apache.spark.util.AkkaUtils$$anonfun.apply(AkkaUtils.scala:53)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort.apply$mcVI$sp(Utils.scala:1446)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:56)
at org.apache.spark.SparkEnv$.create(SparkEnv.scala:153)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:203)
at TweeProcessor$.main(TweeProcessor.scala:10)
at TweeProcessor.main(TweeProcessor.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:134)
Caused by: java.lang.ClassNotFoundException: scala.collection.GenTraversableOnce$class
at java.net.URLClassLoader.run(URLClassLoader.java:366)
at java.net.URLClassLoader.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 23 more
Tried looking up online, most answers point to a mismatch between API versions and Scala version, but none are specific to Spark.
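The trace shows the failure happens while the driver constructs its SparkContext (TweeProcessor.scala line 10). A minimal driver along these lines hits the same code path (a hypothetical reconstruction; the asker's actual TweeProcessor source is not shown):

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical reconstruction of a minimal Spark 1.1.0 driver.
object TweeProcessor {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("test-one").setMaster("local[*]")
    // With a mismatched Scala library on the classpath, the NoClassDefFoundError
    // is thrown here, while Spark brings up its Akka-based actor system.
    val sc = new SparkContext(conf)
    sc.stop()
  }
}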
Answered by lmm
spark-core_2.10 is built for use with 2.10.x versions of Scala. You should use
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"
which will select the correct _2.10 or _2.11 artifact for your Scala version.
Also make sure you're compiling against the same versions of Scala and Spark as the ones on the cluster where you're running this.
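For reference, the asker's build.sbt with this change might look as follows (a sketch assuming scalaVersion stays at 2.10.4 to match the published spark-core 1.1.0 artifacts):

name := "test-one"

version := "1.0"

scalaVersion := "2.10.4"

// %% appends the Scala binary-version suffix (_2.10 here) to the artifact name
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0"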
Answered by BlitzKrieg
Downgrade the Scala version to 2.10.4:
name := "test-one"
version := "1.0"
//scalaVersion := "2.11.2"
scalaVersion := "2.10.4"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
Answered by raisashi
This is a version compatibility issue. spark-core_2.10 is built with Scala 2.10, while your sbt file says you are using Scala 2.11. Either downgrade your Scala version to 2.10 or move to a Spark artifact built for Scala 2.11.
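A quick way to confirm which Scala runtime actually ends up on the classpath (a sketch, not part of the original answer; the object name is illustrative):

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // Prints e.g. "version 2.10.4"; this should match the _2.xx suffix of spark-core.
    println(scala.util.Properties.versionString)
  }
}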
Answered by Yash P Shah
scalaVersion := "2.11.1"
libraryDependencies ++= Seq(
"org.apache.spark" % "spark-core_2.11" % "2.2.0",
"org.apache.spark" % "spark-sql_2.11" % "2.2.0"
)
This configuration worked for me.
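To verify that setup, a minimal local driver along these lines can be run (a sketch; the object name, app name, and local master are assumptions, not part of the answer):

import org.apache.spark.sql.SparkSession

object SmokeTest {
  def main(args: Array[String]): Unit = {
    // spark-sql_2.11 2.2.0 provides SparkSession; run locally for the check.
    val spark = SparkSession.builder()
      .appName("smoke-test")
      .master("local[*]")
      .getOrCreate()
    // Trivial job: count a small range; should print 10.
    println(spark.range(10).count())
    spark.stop()
  }
}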

