Spark - "sbt package" - "value $ is not a member of StringContext" - Missing Scala plugin?

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use or share it, but you must do so under the same license and attribute it to the original authors (not me). Original source: http://stackoverflow.com/questions/30445476/

Tags: scala, apache-spark, intellij-idea, sbt, apache-spark-sql

Asked by MiguelPeralvo

When running "sbt package" from the command line for a small Spark Scala application, I'm getting the "value $ is not a member of StringContext" compilation error on the following line of code:

val joined = ordered.join(empLogins, $"login" === $"username", "inner")
  .orderBy($"count".desc)
  .select("login", "count")

IntelliJ 13.1 gives me the same error message. The same .scala source code compiles without any issue in Eclipse 4.4.2, and it also builds fine with Maven from the command line in a separate Maven project.

It looks like sbt doesn't recognize the $ sign because I'm missing some plugin in my project/plugins.sbt file or some setting in my build.sbt file.

Are you familiar with this issue? Any pointers would be appreciated. I can provide build.sbt and/or project/plugins.sbt if needed.

Answered by Justin Pihony

You need to make sure you import sqlContext.implicits._

This gets you implicit class StringToColumn extends AnyRef

which is documented as:

Converts $"col name" into a Column.

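For the Spark 1.x API the question targets, a minimal self-contained sketch might look as follows; the data, app name, and column values here are made-up assumptions:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setMaster("local").setAppName("dollar-notation-demo")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

// This import brings StringToColumn (and toDF on local collections) into scope.
import sqlContext.implicits._

// Hypothetical data standing in for the question's DataFrame.
val ordered = sc.parallelize(Seq(("alice", 3), ("bob", 5))).toDF("login", "count")
ordered.orderBy($"count".desc).show()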

Answered by mrsrinivas

In Spark 2.0+

The $-notation for columns can be used by importing the implicits defined on the SparkSession object (spark):

// Build (or reuse) a local SparkSession.
val spark = org.apache.spark.sql.SparkSession.builder
  .master("local")
  .appName("App name")
  .getOrCreate()

// Enables the $"..." column syntax (among other implicit conversions).
import spark.implicits._

Then your code with the $-notation works:

val joined = ordered.join(empLogins, $"login" === $"username", "inner")
  .orderBy($"count".desc)
  .select("login", "count")
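
For completeness, a runnable sketch of the full pipeline; the ordered and empLogins DataFrames below are hypothetical toy data standing in for the question's:

import spark.implicits._

// Hypothetical stand-ins for the question's DataFrames.
val ordered = Seq(("alice", 3), ("bob", 5)).toDF("login", "count")
val empLogins = Seq(("alice", "Engineering"), ("bob", "Operations")).toDF("username", "dept")

val joined = ordered.join(empLogins, $"login" === $"username", "inner")
  .orderBy($"count".desc)
  .select("login", "count")
joined.show()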

Answered by Pramit

Great answers, guys. If resolving the import is a concern, then this will work:

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.not  // not(...) lives in the functions object

val ss = SparkSession.builder().appName("test").getOrCreate()
val dataDf = ...  // your DataFrame

// The implicits on the session's SQLContext also enable the $"..." syntax.
import ss.sqlContext.implicits._
dataDf.filter(not($"column_name1" === "condition"))
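
Note that not comes from org.apache.spark.sql.functions (imported in the snippet above). On Spark 2.0+, the =!= column operator expresses the same inequality without the helper function:

dataDf.filter($"column_name1" =!= "condition")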