Scala java.lang.NoSuchMethodError: Jackson databind and Spark
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/30000607/
java.lang.NoSuchMethodError Jackson databind and Spark
Asked by user1077071
I am trying to run spark-submit with Spark 1.1.0 and Jackson 2.4.4. I have Scala code which uses Jackson to deserialize JSON into case classes. That works just fine on its own, but when I use it with Spark I get the following error:
15/05/01 17:50:11 ERROR Executor: Exception in task 0.0 in stage 1.0 (TID 2)
java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.introspect.POJOPropertyBuilder.addField(Lcom/fasterxml/jackson/databind/introspect/AnnotatedField;Lcom/fasterxml/jackson/databind/PropertyName;ZZZ)V
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector.com$fasterxml$jackson$module$scala$introspect$ScalaPropertiesCollector$$_addField(ScalaPropertiesCollector.scala:109)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$$anonfun$apply.apply(ScalaPropertiesCollector.scala:100)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields$$anonfun$apply.apply(ScalaPropertiesCollector.scala:99)
at scala.Option.foreach(Option.scala:236)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields.apply(ScalaPropertiesCollector.scala:99)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector$$anonfun$_addFields.apply(ScalaPropertiesCollector.scala:93)
at scala.collection.GenTraversableViewLike$Filtered$$anonfun$foreach.apply(GenTraversableViewLike.scala:109)
at scala.collection.Iterator$class.foreach(Iterator.scala:727)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1157)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.SeqLike$$anon.foreach(SeqLike.scala:635)
at scala.collection.GenTraversableViewLike$Filtered$class.foreach(GenTraversableViewLike.scala:108)
at scala.collection.SeqViewLike$$anon.foreach(SeqViewLike.scala:80)
at com.fasterxml.jackson.module.scala.introspect.ScalaPropertiesCollector._addFields(ScalaPropertiesCollector.scala:93)
Here is my build.sbt:
//scalaVersion in ThisBuild := "2.11.4"
scalaVersion in ThisBuild := "2.10.5"
retrieveManaged := true
libraryDependencies += "org.scala-lang" % "scala-reflect" % scalaVersion.value
libraryDependencies ++= Seq(
"junit" % "junit" % "4.12" % "test",
"org.scalatest" %% "scalatest" % "2.2.4" % "test",
"org.mockito" % "mockito-core" % "1.9.5",
"org.specs2" %% "specs2" % "2.1.1" % "test",
"org.scalatest" %% "scalatest" % "2.2.4" % "test"
)
libraryDependencies ++= Seq(
"org.apache.hadoop" % "hadoop-core" % "0.20.2",
"org.apache.hbase" % "hbase" % "0.94.6"
)
//libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.3.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.10" % "1.1.0"
libraryDependencies += "com.fasterxml.Hymanson.module" %% "Hymanson-module-scala" % "2.4.4"
//libraryDependencies += "com.fasterxml.Hymanson.module" %% "Hymanson-module-scala" % "2.3.1"
//libraryDependencies += "com.fasterxml.Hymanson.module" %% "Hymanson-module-scala" % "2.5.0"
libraryDependencies += "com.typesafe" % "config" % "1.2.1"
resolvers += Resolver.mavenLocal
As you can see, I have tried many different versions of Jackson.
Here is the shell script I use to run spark-submit:
#!/bin/bash
sbt package
CLASS=com.org.test.spark.test.SparkTest
SPARKDIR=/Users/user/Desktop/
#SPARKVERSION=1.3.0
SPARKVERSION=1.1.0
SPARK="$SPARKDIR/spark-$SPARKVERSION/bin/spark-submit"
jar_jackson=/Users/user/scala_projects/lib_managed/bundles/com.fasterxml.jackson.module/jackson-module-scala_2.10/jackson-module-scala_2.10-2.4.4.jar
"$SPARK" \
--class "$CLASS" \
--jars $jar_jackson \
--master local[4] \
/Users/user/scala_projects/target/scala-2.10/spark_project_2.10-0.1-SNAPSHOT.jar \
print /Users/user/test.json
I pass the path of the Jackson jar to the spark-submit command with --jars. I have even tried different versions of Spark. I have also specified the paths for the individual Jackson jars (databind, annotations, etc.), but that didn't resolve the issue. Any help would be appreciated. Thank you.
Answered by Yuvraj Beegala
I had the same problem: my play-json jar was using Jackson 2.3.2 and Spark was using Jackson 2.4.4.
While the Spark application was running, it was unable to find the method in Jackson 2.3.2 and I got the same exception.
I checked the Maven dependency hierarchy for Jackson. It showed which version was resolved and from which jar (here Play pulled in 2.3.2), and because my play-json dependency was declared first in the dependency list, the 2.3.2 version won.
So I tried placing the Play dependency at the end of all the dependencies, after the Spark dependency, and it worked well. This time 2.4.4 was resolved and 2.3.2 was omitted.
Source:
Note that if two dependency versions are at the same depth in the dependency tree, until Maven 2.0.8 it was not defined which one would win, but since Maven 2.0.9 it's the order in the declaration that counts: the first declaration wins.
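For an sbt build like the one in the question, a rough equivalent of reordering the Maven declarations is to force the Jackson version explicitly. A minimal sketch, assuming the Spark build expects Jackson 2.4.x:

// Sketch only: dependencyOverrides makes the chosen version win regardless of
// declaration order; 2.4.4 is an assumption and should match the version your
// Spark distribution ships with.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % "2.4.4"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-annotations" % "2.4.4"
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.4.4"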
Answered by Erik Schmiegelow
I just ran into the same problem with Jackson and Spark. As I was using sbt, like user1077071, I followed these steps:
- Installed the excellent dependency graph plugin for sbt: https://github.com/jrudolph/sbt-dependency-graph
- Discovered that in my case, play-json was depending on Jackson 2.3
- Added Jackson 2.4 to my libraryDependencies
I did have to apply that approach to multiple Jackson libs though: core, annotations and databind. databind was the culprit, but the others should be bumped as well to avoid clashes.
After that, it worked like a charm.
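In build.sbt terms, the steps above boil down to declaring the newer Jackson modules directly so they take precedence over the transitive 2.3 ones. A minimal sketch, with 2.4.4 assumed as the target version:

val jacksonVersion = "2.4.4"  // assumption: align with the version Spark pulls in

libraryDependencies ++= Seq(
  "com.fasterxml.jackson.core"   %  "jackson-core"         % jacksonVersion,
  "com.fasterxml.jackson.core"   %  "jackson-annotations"  % jacksonVersion,
  "com.fasterxml.jackson.core"   %  "jackson-databind"     % jacksonVersion,
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % jacksonVersion
)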
Answered by Akhil S Kamath
I got java.lang.NoSuchMethodError for the Jackson databind method ...introspect.AnnotatedMember.annotations(). The issue was resolved by updating the Maven dependency for jackson-databind from version 2.9.0.pr3 to 2.9.1.
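The sbt equivalent of that Maven bump would be a single line; a sketch, assuming the 2.9.x line is what the rest of the classpath expects:

libraryDependencies += "com.fasterxml.jackson.core" % "jackson-databind" % "2.9.1"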
Answered by HoTicE
I got the error

java.lang.NoSuchMethodError: com.fasterxml.jackson.core.JsonStreamContext.<init>(Lcom/fasterxml/jackson/core/JsonStreamContext;)V

when I updated the com.fasterxml.jackson.core libraries from 2.8.9 to 2.9.1.
In my case, the resolution was to look through the Gradle dependencies and exclude all the conflicts in build.gradle:
compile('org.springframework.boot:spring-boot-starter-web:1.5.7.RELEASE') {
    exclude group: "com.fasterxml.jackson.core"
}
compile('org.springframework.boot:spring-boot-starter-jdbc:1.5.7.RELEASE') {
    exclude group: "com.fasterxml.jackson.core"
}
compile('com.fasterxml.jackson.core:jackson-databind:2.9.1') {
    exclude module: "jackson-annotations"
    exclude module: "jackson-core"
}
compile('com.fasterxml.jackson.core:jackson-annotations:2.9.1')
compile('com.fasterxml.jackson.core:jackson-core:2.9.1')
compile 'org.scala-lang:scala-library:2.12.3'
compile('com.fasterxml.jackson.module:jackson-module-scala_2.12:2.9.1') {
    exclude group: "org.scala-lang"
    exclude module: "jackson-core"
    exclude module: "jackson-annotations"
    exclude module: "jackson-databind"
}
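As a side note, Gradle's built-in "dependencies" task prints the resolved dependency tree, which makes it easier to spot which modules pull in the conflicting Jackson versions before writing the excludes.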
Answered by ForeverLearner
If you are using the latest Spark version, 3.0.0-preview2, the configuration below is a working build.sbt:
name := "scala-streams"
version := "0.1"
scalaVersion := "2.12.10"
val sparkVersion = "3.0.0-preview2"
val playVersion="2.8.1"
val jacksonVersion="2.10.1"
//override if you wish to
//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-core" % jacksonVersion
//dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % jacksonVersion
libraryDependencies ++= Seq(
"org.apache.spark" %% "spark-streaming" % sparkVersion,
"org.apache.spark" %% "spark-core" % sparkVersion,
"org.apache.spark" %% "spark-sql" % sparkVersion,
"com.typesafe.play" %% "play-json" % playVersion
)
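If Jackson version clashes still show up with this setup, the commented-out dependencyOverrides lines above can be uncommented to pin jackson-core and jackson-databind to a single version instead of whatever Spark and play-json pull in transitively.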
Answered by chenzhongpu
The main reason, I think, is that you don't specify the right dependencies.
If you use third-party libraries and submit to Spark directly, the better way is to use sbt-assembly (https://github.com/sbt/sbt-assembly).
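A minimal sketch of that approach, assuming sbt 0.13-era syntax and an illustrative plugin version; bundling your dependencies into one fat jar means your Jackson version travels with your application code instead of depending on whatever spark-submit puts on the classpath first:

// project/plugins.sbt -- the plugin version here is an assumption; pick one
// that matches your sbt release
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.10")

// build.sbt -- mark Spark itself as "provided" so it is not bundled into the
// assembled jar
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0" % "provided"

// then build and submit the fat jar:
//   sbt assembly
//   spark-submit --class com.org.test.spark.test.SparkTest --master local[4] target/scala-2.10/<assembled-jar>.jar

If the bundled Jackson still clashes with the copy Spark loads first, sbt-assembly's shade rules can additionally rename the com.fasterxml.jackson packages inside the fat jar.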

