scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found

Note: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/39791996/


scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found

scala · apache-spark · bigdata

Asked by Elsayed

I'm trying to build a Spark Streaming application with sbt package, and I can't figure out the cause of this error.


Here is part of the error:


scala.reflect.internal.MissingRequirementError: object java.lang.Object in compiler mirror not found.
  at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
  at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)
  at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:40)


And here is the code:


import org.apache.spark.SparkContext
import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.twitter._
import twitter4j.Status

object TrendingHashTags {
  def main(args: Array[String]): Unit = {
    // The first eight arguments are credentials and runtime settings; any remaining ones are filters.
    val Array(consumerKey, consumerSecret, accessToken, accessTokenSecret,
      lang, batchInterval, minThreshold, showCount) = args.take(8)
    val filters = args.takeRight(args.length - 8)

    // Twitter4J reads its OAuth credentials from system properties.
    System.setProperty("twitter4j.oauth.consumerKey", consumerKey)
    System.setProperty("twitter4j.oauth.consumerSecret", consumerSecret)
    System.setProperty("twitter4j.oauth.accessToken", accessToken)
    System.setProperty("twitter4j.oauth.accessTokenSecret", accessTokenSecret)

    val conf = new SparkConf().setAppName("TrendingHashTags")
    val ssc = new StreamingContext(conf, Seconds(batchInterval.toInt))

    // Stream of tweets, restricted to the requested language, split into words.
    val tweets = TwitterUtils.createStream(ssc, None, filters)
    val tweetsFilteredByLang = tweets.filter { tweet => tweet.getLang() == lang }
    val statuses = tweetsFilteredByLang.map { tweet => tweet.getText() }
    val words = statuses.flatMap { status => status.split("""\s+""") }
    val hashTags = words.filter { word => word.startsWith("#") }

    // Keep a running count per hashtag across batches.
    val hashTagPairs = hashTags.map { hashtag => (hashtag, 1) }
    val tagsWithCounts = hashTagPairs.updateStateByKey(
      (counts: Seq[Int], prevCount: Option[Int]) =>
        prevCount.map { c => c + counts.sum }.orElse { Some(counts.sum) }
    )

    // Only report hashtags above the threshold, sorted by count.
    val topHashTags = tagsWithCounts.filter { case (t, c) =>
      c > minThreshold.toInt
    }
    val sortedTopHashTags = topHashTags.transform { rdd =>
      rdd.sortBy({ case (w, c) => c }, false)
    }
    sortedTopHashTags.print(showCount.toInt)

    ssc.start()
    ssc.awaitTermination()
  }
}
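
For context, a minimal build.sbt for a project like this might look like the sketch below. The Scala version, Spark version and the Apache Bahir spark-streaming-twitter artifact are illustrative assumptions, not taken from the question; with something along these lines in place, sbt package produces the application jar.

name := "TrendingHashTags"

scalaVersion := "2.11.12"

// Versions below are assumptions for illustration; align them with your Spark cluster.
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"              % "2.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming"         % "2.2.0" % "provided",
  // For Spark 2.x the Twitter receiver lives in Apache Bahir rather than Spark itself.
  "org.apache.bahir" %% "spark-streaming-twitter" % "2.2.0"
)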

Answered by Elsayed

I solved this issue: I found that I was using Java 9, which isn't compatible with the Scala version, so I moved from Java 9 back to Java 8.

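A quick way to check which JDK sbt is compiling with, and to keep the build targeting Java 8, is sketched below; the version string shown is a hypothetical example.

$ java -version
java version "1.8.0_181"    <- hypothetical output; any 1.8.x build is Java 8

// In build.sbt, have both javac and scalac emit Java 8 compatible output:
javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
scalacOptions += "-target:jvm-1.8"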