scala - Spark 2.0 missing spark implicits

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same CC BY-SA license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/39968707/

Spark 2.0 missing spark implicits

scala, apache-spark, spark-dataframe

Asked by TheM00s3

Using Spark 2.0, I'm seeing that it is possible to turn a DataFrame of Rows into a DataFrame of case classes. When I try to do so, I'm greeted with a message telling me to import spark.implicits._. The issue I have is that IntelliJ isn't recognizing that as a valid import statement. I'm wondering whether it has moved and the message hasn't been updated, or whether I don't have the correct packages in my build settings. Here is my build.sbt:

libraryDependencies ++= Seq(
  "org.mongodb.spark" % "mongo-spark-connector_2.11" % "2.0.0-rc0",
  "org.apache.spark" % "spark-core_2.11" % "2.0.0",
  "org.apache.spark" % "spark-sql_2.11" % "2.0.0"
)
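
The conversion the question describes looks roughly like the sketch below. This is a minimal, hedged example: the Person case class, the ImplicitsExample object, and the people.json input file are hypothetical names introduced here for illustration, not part of the original question.

import org.apache.spark.sql.SparkSession

// Hypothetical case class; its fields are assumed to match the input columns.
case class Person(name: String, age: Long)

object ImplicitsExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("implicits example")
      .master("local[*]")
      .getOrCreate()

    // Bring in the implicit Encoders needed for the .as[Person] conversion below.
    import spark.implicits._

    // Turn a DataFrame of Rows into a Dataset of the case class.
    val df = spark.read.json("people.json") // assumed input file
    val people = df.as[Person]
    people.show()

    spark.stop()
  }
}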

Answered by marios

There is no package called spark.implicits.

With spark here, it refers to the SparkSession. If you are inside the REPL, the session is already defined as spark, so you can just type:

import spark.implicits._

If you have defined your own SparkSession somewhere in your code, then adjust it accordingly:

val mySpark = SparkSession
  .builder()
  .appName("Spark SQL basic example")
  .config("spark.some.config.option", "some-value")
  .getOrCreate()

// For implicit conversions like converting RDDs to DataFrames
import mySpark.implicits._
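
Once mySpark.implicits._ is in scope, the conversions also apply to local collections. A small sketch continuing from the snippet above; the data and column names here are made up:

// Local collections gain toDF/toDS through the imported implicits.
val people = Seq(("Alice", 29L), ("Bob", 31L)).toDF("name", "age")
people.printSchema()
people.show()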

Answered by Chitral Verma

Spark uses the spark identifier for the SparkSession. This is what causes the confusion. If you created it with something like:

val ss = SparkSession
  .builder()
  .appName("test")
  .master("local[2]")
  .getOrCreate()

The correct way to import the implicits would be:

import ss.implicits._
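
The same import also brings in the $"column" shorthand for referring to columns. A tiny illustration continuing from the ss session above; the data is made up:

// ss.implicits._ also enables $"..." as a shorthand for Column references.
val ds = Seq(("a", 1), ("b", 2)).toDS() // tuple columns default to _1 and _2
ds.filter($"_2" > 1).show()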

Let me know if this helps. Cheers.
