Eclipse: Spark 2.2.1 incompatible with Jackson version 2.8.8
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/47951910/
Spark 2.2.1 incompatible Jackson version 2.8.8
Asked by Fobi
My configuration is:
- Scala 2.11 (plugin Scala IDE)
- Eclipse Neon.3 Release (4.6.3)
- Windows 7 64bit
I want to run this simple Scala code (Esempio.scala):
package it.scala

// Spark imports
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

object Wordcount {
  def main(args: Array[String]) {
    val inputs: Array[String] = new Array[String](2)
    // Backslashes must be escaped in Scala string literals
    inputs(0) = "C:\\Users\\FobiDell\\Desktop\\input"
    inputs(1) = "C:\\Users\\FobiDell\\Desktop\\output"

    // SparkConf holds the application parameters handed to the chosen
    // cluster manager (YARN, Mesos or Standalone)
    val conf = new SparkConf()
    conf.setAppName("Smartphone Addiction")
    conf.setMaster("local")

    // SparkContext connects to the chosen cluster manager
    val sc = new SparkContext(conf)

    // Read the file and create an RDD
    val rawData = sc.textFile(inputs(0))

    // Split the lines into words with a flatMap operation
    val words = rawData.flatMap(line => line.split(" "))

    // Count the individual words using map and reduceByKey
    val wordCount = words.map(word => (word, 1)).reduceByKey(_ + _)

    // Save the result
    wordCount.saveAsTextFile(inputs(1))

    // Stop the Spark context
    sc.stop()
  }
}
Everything works fine from the Spark shell. However, from the Eclipse IDE, if I select the file (Esempio.scala) and run it via Run -> Run As -> Scala Application, I get this exception:
Exception in thread "main" java.lang.ExceptionInInitializerError
at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
at org.apache.spark.SparkContext.textFile(SparkContext.scala:830)
at it.scala.Wordcount$.main(Esempio.scala:47)
at it.scala.Wordcount.main(Esempio.scala)
Caused by: com.fasterxml.jackson.databind.JsonMappingException: Incompatible Jackson version: 2.8.8
at com.fasterxml.jackson.module.scala.JacksonModule$class.setupModule(JacksonModule.scala:64)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.setupModule(DefaultScalaModule.scala:19)
at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:745)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
... 4 more
My pom.xml file is:
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>it.hgfhgf.xhgfghf</groupId>
  <artifactId>progetto</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>progetto</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>

    <!-- Neo4j JDBC driver -->
    <dependency>
      <groupId>org.neo4j</groupId>
      <artifactId>neo4j-jdbc-driver</artifactId>
      <version>3.1.0</version>
    </dependency>

    <!-- Scala -->
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.11</version>
    </dependency>

    <!-- Spark -->
    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.2.1</version>
    </dependency>
  </dependencies>
</project>
I noticed that the .jar files in the spark-2.2.1-bin-hadoop2.7/jars directory are:
- jackson-core-2.6.5.jar
- jackson-databind-2.6.5.jar
- jackson-module-paranamer-2.6.5.jar
- jackson-module-scala_2.11-2.6.5.jar
- jackson-annotations-2.6.5.jar
Can anyone explain to me in simple terms what this exception is and how it can be resolved?
Answered by subodh
Spark 2.x ships with Jackson 2.6.5, while neo4j-jdbc-driver pulls in Jackson 2.8.8, so there is a dependency conflict between two different versions of the Jackson library. That is why you are getting the Incompatible Jackson version: 2.8.8 error. You can see which versions end up on the classpath with `mvn dependency:tree -Dincludes=com.fasterxml.jackson.core`.
Try to override the dependency version for the modules below inside your pom.xml and see if it works:
- jackson-core
- jackson-databind
- jackson-module-scala_2.x
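One way to apply that override (a sketch under the assumption that you want the 2.6.5 versions Spark 2.2.1 bundles; this block is not from the original answer) is a Maven `<dependencyManagement>` section, which forces the version of transitive dependencies without adding direct ones:

```xml
<!-- Pin the Jackson modules to the version bundled with Spark 2.2.1 -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-core</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.core</groupId>
      <artifactId>jackson-databind</artifactId>
      <version>2.6.5</version>
    </dependency>
    <dependency>
      <groupId>com.fasterxml.jackson.module</groupId>
      <artifactId>jackson-module-scala_2.11</artifactId>
      <version>2.6.5</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Note that `dependencyManagement` only pins versions; it does not add the artifacts to the build unless something already depends on them.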
Or try adding the dependency below to your pom.xml:
<dependency>
  <groupId>com.fasterxml.jackson.module</groupId>
  <artifactId>jackson-module-scala_2.11</artifactId>
  <version>2.8.8</version>
</dependency>
Answered by David Urry
Spark version 2.1.1 works with Jackson 2.6.5. Use the following:
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <version>2.6.5</version>
</dependency>
Answered by Lorenz Bernauer
I ran into the same Jackson version conflict. In addition to overriding jackson-core, jackson-databind and jackson-module-scala_2.x, I also pinned jackson-annotations in my pom.xml, which resolved the conflict.
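For completeness, the extra pin this answer describes would look something like the fragment below (a sketch assuming the same target version as the other overrides; this snippet is not from the original answer):

```xml
<!-- jackson-annotations must match the version of the other Jackson modules -->
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-annotations</artifactId>
  <version>2.6.5</version>
</dependency>
```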
Answered by Friggles
Not sure if this helps anyone who has hit this problem with an sbt project that uses Scala 2.12. Putting in jackson-module-scala_2.11 doesn't quite work; jackson-module-scala 2.6.7 is the one version in that line that has a Scala 2.12 build.
The following lines in build.sbt worked:
dependencyOverrides ++= {
  Seq(
    "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7.1",
    "com.fasterxml.jackson.core" % "jackson-databind" % "2.6.7",
    "com.fasterxml.jackson.core" % "jackson-core" % "2.6.7"
  )
}
This fixed the problem for Spark 2.4.5.
Answered by Ankur Srivastava
Below is the combination of jars that worked for me:
aws-java-sdk-1.7.4.jar
hadoop-aws-2.7.3.jar
joda-time-2.9.6.jar
hadoop-client-2.7.3-sources.jar
hadoop-client-2.7.3.jar
hadoop-client-2.6.0-javadoc.jar
hadoop-client-2.6.0.jar
jets3t-0.9.4.jar
jackson-core-2.10.0.jar
jackson-databind-2.8.6.jar
jackson-module-scala_2.11-2.8.5.jar
jackson-annotations-2.8.7.jar