Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute the original authors (not the translator). Original question: http://stackoverflow.com/questions/46293697/



scala, apache-spark, intellij-idea, sbt

Asked by TheShark

Any reason why I get this error? Initially the IDE plugin for Scala was 2.12.3, but since I'm working with Spark 2.2.0, I manually changed it to Scala 2.11.11.


Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/09/19 12:08:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
    at scala.xml.Null$.<init>(Null.scala:23)
    at scala.xml.Null$.<clinit>(Null.scala)
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:67)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:84)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:221)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:163)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at sparkEnvironment$.<init>(Ticket.scala:33)
    at sparkEnvironment$.<clinit>(Ticket.scala)
    at Ticket$.main(Ticket.scala:39)
    at Ticket.main(Ticket.scala)
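For context, a build in which the project's Scala version and the Spark artifact's Scala suffix disagree can produce exactly this kind of linkage error. The sketch below is purely hypothetical (it is not the asker's actual build file); it only illustrates such a mismatch in an sbt build:

// Hypothetical mismatched build.sbt, for illustration only
scalaVersion := "2.12.3"   // project compiled against the Scala 2.12 series

// ...while this Spark 2.2.0 artifact is built for the Scala 2.11 series.
// Artifacts for different Scala binary versions are not binary compatible,
// so mixing them on one classpath can show up at runtime as linkage errors
// such as java.lang.NoSuchMethodError: scala.Product.$init$(Lscala/Product;)V
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"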

Answered by Akash Sethi

Make sure your Spark version is compatible with the corresponding Scala version.


This error is common when using the Scala 2.12 series with any version of Spark built for Scala 2.11.


You can try using the 2.11 series of Scala with Spark, i.e.


libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

As you can see, in this dependency spark-core_2.11 is associated with Scala version 2.11.


That's why it's safer (more compatible) to use %% and avoid hardcoding the Scala version in Spark dependencies. Let the build tool resolve the required Scala version for you automatically, as follows:


libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"

The above declaration will automatically pick the artifact that matches your project's Scala version.
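As a rough illustration of what %% does (assuming an sbt build with scalaVersion set as shown): sbt derives the Scala binary version from scalaVersion and appends it to the artifact name, so the %% form resolves to the same artifact as the hardcoded one above:

scalaVersion := "2.11.11"   // sbt derives the binary version 2.11 from this

// %% appends "_2.11" to the artifact name automatically...
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
// ...which resolves to the same artifact as the explicit form:
// "org.apache.spark" % "spark-core_2.11" % "2.2.0"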
