Scala SPARK/SQL: Spark can't resolve symbol toDF

Note: this page is a translated copy of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/31143840/


SPARK/SQL: Spark can't resolve symbol toDF

Tags: scala, apache-spark

Asked by yeyimilk

In my project, my external library is spark-assembly-1.3.1-hadoop2.6.0. If I press '.', the IDE suggests toDF(), but when I actually write it in my code I get "cannot resolve symbol toDF()". I'm sorry, but I can't find toDF() in the Apache Spark docs.


case class Feature(name: String, value: Double, time: String, period: String)

val RESRDD = RDD.map(tuple => {
  var bson = new BasicBSONObject()
  bson.put("name", name)
  bson.put("value", value)
  (null, bson)
})

RESRDD
  .map(_._2)
  .map(f => Feature(f.get("name").toString, f.get("value").toString.toDouble))
  .toDF()

Answered by zero323

To be able to use toDF you have to import sqlContext.implicits first:


val sqlContext = new org.apache.spark.sql.SQLContext(sc)
import sqlContext.implicits._

case class Foobar(foo: String, bar: Integer)

val foobarRdd = sc.parallelize(("foo", 1) :: ("bar", 2) :: ("baz", -1) :: Nil).
    map { case (foo, bar) => Foobar(foo, bar) } 

val foobarDf = foobarRdd.toDF
foobarDf.limit(1).show
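
Applied to the RESRDD pipeline from the question, the same fix looks roughly like this (a sketch that reuses the sqlContext and the import sqlContext.implicits._ shown above; the Feature case class is trimmed to the two fields the map actually supplies, which is an assumption about the intended schema):

// Case classes used with toDF should be defined at the top level,
// not inside the method that builds the DataFrame.
case class Feature(name: String, value: Double)

val featureDf = RESRDD
  .map(_._2)  // keep only the BSON payload from the (null, bson) pairs
  .map(f => Feature(f.get("name").toString, f.get("value").toString.toDouble))
  .toDF()

featureDf.show()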

Answered by Vineet Srivastava

This is a very late response to the question, but for the sake of people who are still looking for the answer:


Try the same command on Spark 1.6; it will work.


I was facing the same issue, searched Google without finding a solution, then upgraded Spark from 1.5 to 1.6 and it worked.


If you don't know your Spark version:


spark-submit --version (from command prompt)
sc.version (from Scala Shell)
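
On Spark 2.x and later the entry point is SparkSession rather than SQLContext, and the implicits that enable toDF come from the session object. A minimal sketch (the app name here is just a placeholder):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("toDF-example")  // placeholder app name
  .master("local[*]")
  .getOrCreate()

import spark.implicits._  // provides toDF/toDS on RDDs and Seqs of Products

val df = spark.sparkContext
  .parallelize(Seq(("foo", 1), ("bar", 2)))
  .toDF("name", "value")

df.show()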