NoClassDefFoundError: scala/Product$class

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow.

Original URL: http://stackoverflow.com/questions/44387404/
Asked by Dialong
I am new to Scala and I am trying to create a mixed Scala/Java project. However, when I run the test code I get an error:
(screenshot of a stack trace ending in java.lang.NoClassDefFoundError: scala/Product$class)
My pom.xml is as follows:
<properties>
    <scala.version>2.12.2</scala.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-reflect</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming-kafka-0-10_2.11</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.scala-tools</groupId>
            <artifactId>maven-scala-plugin</artifactId>
            <version>2.15.2</version>
            <executions>
                <execution>
                    <id>compile</id>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                    <phase>compile</phase>
                </execution>
                <execution>
                    <id>test-compile</id>
                    <goals>
                        <goal>testCompile</goal>
                    </goals>
                    <phase>test-compile</phase>
                </execution>
                <execution>
                    <phase>process-resources</phase>
                    <goals>
                        <goal>compile</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
        <plugin>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.5</source>
                <target>1.5</target>
            </configuration>
        </plugin>
    </plugins>
</build>
My code is as follows:
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

class BptConsumer {
  def consumeLogevent(): Unit = {
    val conf = new SparkConf().setMaster("local[2]").setAppName("PVStatistics")
    val ssc = new StreamingContext(conf, Seconds(5))
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "172.20.13.196:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "1",
      "auto.offset.reset" -> "latest",
      "enable.auto.commit" -> (false: java.lang.Boolean)
    )
    val topics = Array("fd-blogs-tst")
    // Subscribe to the topic and print each (key, value) pair per batch
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      Subscribe[String, String](topics, kafkaParams)
    )
    val lines = stream.map(record => (record.key, record.value))
    lines.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
Could someone help me out in finding the issue?
Answered by Jeffrey Chung
You're using Scala 2.12.2 with Spark libraries that are built with Scala 2.11. The `_2.11` suffix on the Spark artifacts is the Scala binary version they were compiled against, and Scala binary versions are not compatible across 2.11/2.12 (the trait encoding changed in 2.12, so `Product$class` no longer exists, hence the NoClassDefFoundError). Change your Scala version to a 2.11 release:
<properties>
<scala.version>2.11.11</scala.version>
</properties>
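To keep the Scala version and the artifact suffixes from drifting apart again, a common convention (the `scala.binary.version` property name below is just a convention, not something Maven requires) is to factor the binary version into its own property and reference it in every `_2.xx` artifact id:

```xml
<properties>
    <scala.version>2.11.11</scala.version>
    <!-- binary version used as the suffix of Scala-compiled artifacts -->
    <scala.binary.version>2.11</scala.binary.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <!-- resolves to spark-streaming_2.11 -->
        <artifactId>spark-streaming_${scala.binary.version}</artifactId>
        <version>2.1.1</version>
    </dependency>
</dependencies>
```

That way, upgrading Scala later means changing the two properties in one place instead of hunting through every dependency.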

