java.lang.NoSuchMethodError: scala.Predef$.refArrayOps
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/40328948/
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps
Asked by octavian
I have the following class:
import scala.util.{Success, Failure, Try}

class MyClass {
  def openFile(fileName: String): Try[String] = {
    Failure(new Exception("some message"))
  }

  def main(args: Array[String]): Unit = {
    openFile(args.head)
  }
}
Which has the following unit test:
class MyClassTest extends org.scalatest.FunSuite {
  test("pass inexistent file name") {
    val myClass = new MyClass()
    assert(myClass.openFile("./noFile").failed.get.getMessage == "Invalid file name")
  }
}
When I run sbt test I get the following error:
java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
at org.scalatest.tools.FriendlyParamsTranslator$.translateArguments(FriendlyParamsTranslator.scala:174)
at org.scalatest.tools.Framework.runner(Framework.scala:918)
at sbt.Defaults$$anonfun$createTestRunners.apply(Defaults.scala:533)
at sbt.Defaults$$anonfun$createTestRunners.apply(Defaults.scala:527)
at scala.collection.TraversableLike$$anonfun$map.apply(TraversableLike.scala:244)
at scala.collection.TraversableLike$$anonfun$map.apply(TraversableLike.scala:244)
at scala.collection.immutable.Map$Map1.foreach(Map.scala:109)
at scala.collection.TraversableLike$class.map(TraversableLike.scala:244)
at scala.collection.AbstractTraversable.map(Traversable.scala:105)
at sbt.Defaults$.createTestRunners(Defaults.scala:527)
at sbt.Defaults$.allTestGroupsTask(Defaults.scala:543)
at sbt.Defaults$$anonfun$testTasks.apply(Defaults.scala:410)
at sbt.Defaults$$anonfun$testTasks.apply(Defaults.scala:410)
at scala.Function8$$anonfun$tupled.apply(Function8.scala:35)
at scala.Function8$$anonfun$tupled.apply(Function8.scala:34)
at scala.Function1$$anonfun$compose.apply(Function1.scala:47)
at sbt.$tilde$greater$$anonfun$$u2219.apply(TypeFunctions.scala:40)
at sbt.std.Transform$$anon.work(System.scala:63)
at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit$$anonfun$apply.apply(Execute.scala:226)
at sbt.ErrorHandling$.wideConvert(ErrorHandling.scala:17)
at sbt.Execute.work(Execute.scala:235)
at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
at sbt.Execute$$anonfun$submit.apply(Execute.scala:226)
at sbt.ConcurrentRestrictions$$anon$$anonfun.apply(ConcurrentRestrictions.scala:159)
at sbt.CompletionService$$anon.call(CompletionService.scala:28)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[error] (test:executeTests) java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
Build definitions:
version := "1.0"
scalaVersion := "2.12.0"
// https://mvnrepository.com/artifact/org.scalatest/scalatest_2.11
libraryDependencies += "org.scalatest" % "scalatest_2.11" % "3.0.0"
I can't figure out what causes this. My class and unit test seem simple enough. Any ideas?
Accepted answer by Alexey Romanov
scalatest_2.11 is the version of ScalaTest compatible only with Scala 2.11.x. Write libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test" (note the %%) instead to pick the correct version automatically, and switch to Scala 2.11.8 until scalatest_2.12 is released (it should be very soon). See http://www.scala-sbt.org/0.13/docs/Cross-Build.html for more.
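For illustration only, a build.sbt corrected along the lines of this answer might look like the sketch below (the version numbers are simply the ones mentioned above, not a recommendation):

version := "1.0"

scalaVersion := "2.11.8"  // stay on 2.11.x until ScalaTest is published for 2.12

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.0" % "test"  // %% resolves scalatest_2.11 automatically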
Answered by Anton Tkachov
I had an SDK under Global Libraries with a different version of Scala (IntelliJ IDEA).
File -> Project Structure -> Global Libraries -> remove the SDK -> rebuild. That fixed the exception for me.
Answered by HHH
I used IntelliJ and simply imported the project again. That is, I closed the open project and imported it as a Maven or SBT project. Note: I selected the Maven option (import Maven projects automatically). The error disappeared.


Answered by MKatleast3
In my experience, if you still get errors after matching the scalatest version and scala version in build.sbt, you have to think about the actual Scala version that is running on your machine.
You can check it by running $ scala and looking at
Welcome to Scala 2.12.1 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_121).
Type in expressions for evaluation. Or try :help.
this type of message.
You need to match this Scala version (e.g. 2.12.1 here) with the one in your build.sbt.
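As a rough sketch only (assuming the REPL on your machine reports 2.12.1 as above, and using 3.0.1 purely as an example ScalaTest release), the build definition would then need to agree with it:

scalaVersion := "2.12.1"  // must match the Scala version actually installed and used

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"  // %% picks the _2.12 artifact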
Answered by Community
When you are using Spark, Hadoop, Scala, and Java, some incompatibilities arise. You have to use versions of each that are compatible with the others. I use Spark 2.4.1, Hadoop 2.7, Java 9.0.1, and Scala 2.11.12; they are compatible with each other.
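A hypothetical build.sbt fragment for that combination might look like the sketch below (the Spark artifacts use the usual org.apache.spark coordinates; which modules you actually need depends on your project):

scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.4.1" % "provided",  // Spark 2.4.x publishes Scala 2.11 builds
  "org.apache.spark" %% "spark-sql"  % "2.4.1" % "provided",
  "org.scalatest"    %% "scalatest"  % "3.0.5" % "test"       // example ScalaTest version for Scala 2.11
)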
Answered by louis l
In my case, the Spark version was what made it incompatible. Changing to Spark 2.4.0 worked for me.
Answered by Andrew Norman
In the Eclipse IDE, the project tends to be preselected with the Scala installation 'Latest 2.12 bundle (dynamic)' configuration. If you are not actually using 2.12 for your Scala project and you attempt to run the project through the IDE, this issue will manifest itself.
I've also noticed that if I rebuild my Eclipse project with the sbt command "eclipse with-source", it has the side effect of resetting the Eclipse project's Scala installation back to the 2.12 setting (even though my build.sbt file is configured for a 2.11 version of Scala). So be on the lookout for both of those scenarios.
Answered by Yousef Irman
Try adding the following line to your build.sbt
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
Your build.sbt should then look like this:
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
With this, the error was solved for me.

