scala (run-main-0) java.lang.NoSuchMethodError

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not me): StackOverflow. Original link: http://stackoverflow.com/questions/29339005/

Date: 2020-10-22 07:01:12  Source: igfitidea


Tags: scala, apache-spark, sbt, nosuchmethoderror

Asked by zhang

I ran into a problem when using sbt to run a Spark job. Compilation finishes, but when I execute the run command I get the error below:


 [error] (run-main-0) java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
    at akka.actor.ActorCell$.<init>(ActorCell.scala:305)
    at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
    at akka.actor.RootActorPath.$div(ActorPath.scala:152)
    at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
    at akka.remote.RemoteActorRefProvider.<init>(RemoteActorRefProvider.scala:124)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor.apply(DynamicAccess.scala:78)
    at scala.util.Try$.apply(Try.scala:191)

Does anyone know what I should do?


Answered by Reddevil

I ran into the same error when I used the scala-library-2.11 jar, but when I replaced it with the scala-library-2.10 jar, it ran fine.

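To confirm which scala-library jar is actually being picked up at runtime, a small check like this can help (a sketch, not part of the original answer; the object name is made up for illustration):

```scala
// Sketch: print the Scala runtime version and the jar that
// scala.collection.immutable.HashSet was loaded from, to confirm
// which scala-library is on the classpath. A library compiled against
// a different Scala binary version triggers NoSuchMethodError.
object ScalaLibraryCheck extends App {
  println("Scala runtime version: " + scala.util.Properties.versionNumberString)

  // getCodeSource can be null when the class comes from the boot classpath
  val codeSource = Option(
    classOf[scala.collection.immutable.HashSet[_]]
      .getProtectionDomain.getCodeSource
  )
  println("scala-library location: " +
    codeSource.map(_.getLocation.toString).getOrElse("<boot classpath>"))
}
```

If the printed version does not match the suffix of your dependencies (e.g. akka-actor_2.10 vs a 2.11 runtime), that mismatch is the cause of the error.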

Answered by Guido

It is probably caused by mixing incompatible Scala versions. When I downgraded from Scala 2.11 to 2.10, I forgot to update one package's version (so one package used 2.11 and the rest 2.10), which produced the same error.


Note: I only had this problem when using IntelliJ.
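In sbt, the easiest way to avoid this kind of partial downgrade is to pin a single scalaVersion and use %% so every dependency gets the matching binary suffix automatically. A hypothetical build.sbt sketch (the version numbers are illustrative, not from the answer):

```scala
// build.sbt (sketch): one scalaVersion for the whole build; %% appends
// the matching _2.10 suffix so no dependency is left on 2.11 by mistake.
scalaVersion := "2.10.6"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
  "com.typesafe.akka" %% "akka-actor" % "2.3.11"
)
```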


Answered by Rahul J

If you are getting this error because you cannot run Jupyter notebooks with Spark 2.1 and Scala 2.11, below is how I was able to make it work. This assumes you have installed Jupyter and Toree.


Prerequisites: make sure Docker is running and gpg is installed, otherwise make fails.


Build steps:


export SPARK_HOME=/Users/<path>/spark-2.1.0-hadoop2.7/ 
git clone https://github.com/apache/incubator-toree.git 
cd incubator-toree 
make clean release APACHE_SPARK_VERSION=2.1.0 
pip install --upgrade ./dist/toree-pip/toree-0.2.0.dev1.tar.gz 
pip freeze |grep toree 
jupyter toree install --spark_home=$SPARK_HOME

========================================================================


To start the notebook: SPARK_OPTS='--master=local[4]' jupyter notebook


Answered by P.C.

I used these versions and everything works now.


    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.6</version>
    </dependency>

    <dependency>
        <groupId>com.typesafe.akka</groupId>
        <artifactId>akka-actor_2.11</artifactId>
        <version>2.3.11</version>
    </dependency>

Answered by Zhi Zheng

Check whether the Scala version you are using corresponds to the precompiled Spark version. (Screenshot from the original answer omitted.)

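That comparison can also be done in code; a minimal sketch (not from the answer; the expected binary version here is an assumption, taken from the usual precompiled Spark 2.x downloads):

```scala
// Sketch: compare the Scala version the code runs on against the binary
// version your Spark distribution was built with (assumed "2.11" here,
// as for most precompiled Spark 2.x builds). A mismatch is exactly the
// situation that produces NoSuchMethodError at runtime.
object VersionMatch extends App {
  val expectedBinary = "2.11" // assumption: adjust to your Spark build
  val runtime = scala.util.Properties.versionNumberString // e.g. "2.11.8"

  if (runtime.startsWith(expectedBinary))
    println(s"OK: runtime Scala $runtime matches Spark's Scala $expectedBinary")
  else
    println(s"Mismatch: runtime Scala $runtime vs Spark's Scala $expectedBinary")
}
```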

Answered by Andrii Abramov

The issue could be reproduced with version 2.11.8.


These days, no downgrade is required. Just update the scala-library version to 2.12.0.


Answered by Willem Bressers

I have the same issue, but where do I alter the scala-library version?


Installation (on Ubuntu 16.04):


sudo apt-get install oracle-java8-installer
wget http://d3kbcqa49mib13.cloudfront.net/spark-2.0.2-bin-hadoop2.7.tgz && tar xvf spark-2.0.2-bin-hadoop2.7.tgz
pip install toree && jupyter toree install

So when I start a notebook it tells me that I am using a different Scala version, even though I haven't installed anything else. (Screenshot of the Scala version omitted.)


My Spark jars folder contains a scala-library-2.11.8.jar file, but how do I tell Toree to use that (or another) file for Scala?


Answered by fitha abdulla

For me, neither Scala 2.11 nor 2.12 worked; downgrading to 2.10.3 did.


Answered by Ben Liao

I had exactly the same problem and fixed it by downgrading Scala from 2.11.8 to 2.10.6.
