Using Scala 2.12 with Spark 2.x

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/42887359/

Tags: scala, apache-spark, abi, binary-compatibility

Asked by NetanelRabinowitz

In the Spark 2.1 docs it's mentioned that:

Spark runs on Java 7+, Python 2.6+/3.4+ and R 3.1+. For the Scala API, Spark 2.1.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

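For context, a minimal build.sbt sketch of what the docs prescribe for Spark 2.1 (the exact version numbers here are illustrative): pin scalaVersion to a 2.11.x release and let sbt's %% operator resolve the matching _2.11 Spark artifacts.

    // build.sbt -- minimal sketch; version numbers are illustrative
    scalaVersion := "2.11.8"

    libraryDependencies ++= Seq(
      // %% appends the Scala binary version, so these resolve to
      // spark-core_2.11 and spark-sql_2.11
      "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
    )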

In the Scala 2.12 release news it's also mentioned that:

Although Scala 2.11 and 2.12 are mostly source compatible to facilitate cross-building, they are not binary compatible. This allows us to keep improving the Scala compiler and standard library.

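The cross-building mentioned there means compiling the same sources once per Scala version and publishing a separate binary artifact for each. In sbt that is usually expressed roughly like this (a sketch; the versions are illustrative):

    // build.sbt -- cross-building sketch
    crossScalaVersions := Seq("2.11.8", "2.12.0")
    // "sbt +package" then builds one jar per Scala version
    // (artifact names suffixed with _2.11 and _2.12)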

But when I build an uber JAR (using Scala 2.12) and run it on Spark 2.1, everything works just fine.

And I know it's not an official source, but on the 47 Degrees blog they mentioned that Spark 2.1 does support Scala 2.12.

How can one explain those (conflicting?) pieces of information?

Answered by user7735456

Spark does not support Scala 2.12. You can follow SPARK-14220 (Build and test Spark against Scala 2.12) for up-to-date status.

Update: Spark 2.4 added experimental Scala 2.12 support.
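
If you want to try that experimental support, a build against Spark 2.4 would look something like the sketch below (versions are illustrative; Spark 2.4 publishes _2.12 artifacts, but Scala 2.11 remained the default at the time):

    // build.sbt -- experimental Scala 2.12 with Spark 2.4 (sketch)
    scalaVersion := "2.12.8"

    libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.0" % "provided"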

Answered by denvercoder9

To add to the answer, I believe it is a typo on their part; https://spark.apache.org/releases/spark-release-2-0-0.html has no mention of Scala 2.12.

Also, if we look at the timing, Scala 2.12 was not released until November 2016, while Spark 2.0.0 was released in July 2016.

References:
https://spark.apache.org/news/index.html
www.scala-lang.org/news/2.12.0/