Spark Submit fails with java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;

Disclaimer: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must follow the same CC BY-SA terms and attribute it to the original authors (not me): Stack Overflow. Original: http://stackoverflow.com/questions/30342273/

java, maven, apache-spark, cassandra-2.0

Asked by mithra

I am using the Spark 1.3.1 prebuilt version, spark-1.3.1-bin-hadoop2.6.tgz, and I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$conforms()Lscala/Predef$$less$colon$less;
    at org.apache.spark.util.Utils$.getSystemProperties(Utils.scala:1418)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:58)
    at org.apache.spark.SparkConf.<init>(SparkConf.scala:52)
    at com.zoho.zbi.Testing.test(Testing.java:43)
    at com.zoho.zbi.Testing.main(Testing.java:39)
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties

I am trying a simple demo app that saves data to Cassandra:

// the calls below rely on the connector's Java API static imports:
//   import static com.datastax.spark.connector.japi.CassandraJavaUtil.javaFunctions;
//   import static com.datastax.spark.connector.japi.CassandraJavaUtil.mapToRow;

SparkConf batchConf = new SparkConf()
        .setSparkHome(sparkHome)
        .setJars(jars)
        .setAppName(ZohoBIConstants.getAppName("cassandra"))//NO I18N
        .setMaster(master)
        .set("spark.cassandra.connection.host", "localhost");

JavaSparkContext sc = new JavaSparkContext(batchConf);

// here we are going to save some data to Cassandra...
List<Person> people = Arrays.asList(
        Person.newInstance(1, "John", new Date()),
        Person.newInstance(2, "Anna", new Date()),
        Person.newInstance(3, "Andrew", new Date())
);
// Person test = Person.newInstance(1, "vini", new Date());
System.out.println("Inside Java API Demo : " + people);

JavaRDD<Person> rdd = sc.parallelize(people);
System.out.println("Inside Java API Demo rdd : " + rdd);

javaFunctions(rdd).writerBuilder("test", "people", mapToRow(Person.class)).saveToCassandra();

System.out.println("Stopping sc");
sc.stop();

When I submit it using spark-submit, it works:

bin/spark-submit --class "abc.efg.Testing" --master spark://xyz:7077 /home/test/target/uber-Cassandra-0.0.1-SNAPSHOT.jar

Here is my pom

Dependencies:

<dependencies>
  <!-- Scala -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
        <scope>compile</scope>
    </dependency>

    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-compiler</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- END Scala -->
  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
  </dependency>

  <dependency>
    <groupId>com.yammer.metrics</groupId>
    <artifactId>metrics-core</artifactId>
    <version>2.2.0</version>
  </dependency>

  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <version>3.8.1</version>
    <scope>test</scope>
  </dependency>

  <dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
    <scope>provided</scope>
  </dependency>

  <dependency>
    <groupId>com.datastax.cassandra</groupId>
    <artifactId>cassandra-driver-core</artifactId>
    <version>2.1.5</version>
  </dependency>

  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20090211</version>
  </dependency>
<!-- Cassandra Spark Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector_2.10</artifactId>
    <version>1.2.0</version>
  </dependency>
<!-- Cassandra java Connector dependency -->
  <dependency>
    <groupId>com.datastax.spark</groupId>
    <artifactId>spark-cassandra-connector-java_2.10</artifactId>
    <version>1.2.0</version>
  </dependency> 

<!-- Spark Core dependency -->
        <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-core_2.11</artifactId>
                <version>1.3.1</version>
        </dependency>
    <!-- Spark dependency -->
        <dependency>
                 <groupId>org.apache.spark</groupId>
                 <artifactId>spark-streaming_2.11</artifactId>
                <version>1.3.1</version>
        </dependency>
    <!-- Spark dependency -->
        <dependency>
                <groupId>org.apache.spark</groupId>
                <artifactId>spark-streaming-kafka_2.10</artifactId>
                <version>1.3.1</version>
        </dependency>
  </dependencies>

and I build using:

<build>
      <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>
           <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>2.3</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                    </execution>
                </executions>
                <configuration>
                    <filters>
                        <filter>
                            <artifact>*:*</artifact>
                            <excludes>
                                <exclude>META-INF/*.SF</exclude>
                                <exclude>META-INF/*.DSA</exclude>
                                <exclude>META-INF/*.RSA</exclude>
                            </excludes>
                        </filter>
                    </filters>
                    <finalName>uber-${project.artifactId}-${project.version}</finalName>
                </configuration>
            </plugin>
           <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>2.3.2</version>
            <configuration>
                <source>1.7</source>
                <target>1.7</target>
            </configuration>
        </plugin>

      </plugins>
    </build>

But when I submit through code, it's not working; any help is much appreciated. I tried adding a scala 2.10.4 property in the pom, still no luck.

I am running it in Eclipse via "run as application", with the master, Spark home, and jars all set on the SparkConf; the error points exactly at the SparkConf line.

My Scala version is:

scala -version
Scala code runner version 2.11.2 -- Copyright 2002-2013, LAMP/EPFL

Does this have anything to do with the issue?

How do I switch to an older version of Scala? The docs say Spark 1.3.1 supports Scala 2.10.x; please let me know how to fix this.

Accepted answer by Maksud

The problem you are experiencing is due to incompatibilities between Scala versions. The prebuilt Spark 1.3.1 distribution is compiled with the older Scala 2.10, because some of Spark's dependencies are not supported under 2.11, including JDBC support.

I would suggest running your Spark cluster with Scala 2.10. However, if you want, you can also compile your Spark package against Scala 2.11 in the following way:

dev/change-version-to-2.11.sh
mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
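
If you stay on the prebuilt Scala 2.10 Spark instead, the pom must use matching _2.10 artifacts everywhere (the question's pom mixes spark-core_2.11 and spark-streaming_2.11 with the _2.10 connector). A minimal sketch of aligned dependencies, assuming Scala 2.10.4 and the versions already used in the question:

<properties>
    <!-- assumed 2.10.x patch release; any Scala 2.10.x matches prebuilt Spark 1.3.1 -->
    <scala.version>2.10.4</scala.version>
</properties>

<dependencies>
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>${scala.version}</version>
    </dependency>
    <!-- Spark artifacts switched from _2.11 to _2.10 to match the connector -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.10</artifactId>
        <version>1.3.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-streaming_2.10</artifactId>
        <version>1.3.1</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector_2.10</artifactId>
        <version>1.2.0</version>
    </dependency>
    <dependency>
        <groupId>com.datastax.spark</groupId>
        <artifactId>spark-cassandra-connector-java_2.10</artifactId>
        <version>1.2.0</version>
    </dependency>
</dependencies>

With every Spark-related artifact on the same Scala binary version, the NoSuchMethodError on scala.Predef$.$conforms no longer occurs, since that signature only exists in Scala 2.10's Predef.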

Answer by Santosh Hencha

I was experiencing the same issue in Scala IDE, and the steps below resolved it.

Note: check the Scala/Spark compatibility for your versions. In my case, Scala 2.11.* is compatible with Spark 2.4.*.

Go to the project >> right click >> Properties >> Scala Compiler >> select the "Use Project Settings" option >> change the "Scala Installation" >> Apply >> Apply and Close. Good to go.

Click the image link below to see the Scala IDE settings.
