How to create SQLContext in Spark using Scala?

Warning: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use/share it, but you must follow the same license and attribute it to the original authors (not me). Original StackOverflow question: http://stackoverflow.com/questions/34387889/

How to create SQLContext in spark using scala?

Tags: scala, apache-spark, sbt, apache-spark-sql

Asked by Aman

I am creating a Scala program that uses SQLContext, built with sbt. This is my build.sbt:

name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"  

And this is the test program:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main (args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
} 

I am getting the below error:

Error:(8, 26) overloaded method constructor SQLContext with alternatives:
  (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
  (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
 cannot be applied to (org.apache.spark.SparkContext.type)
    val sqlcontexttest = new SQLContext(sc)  

Can anybody please let me know the issue, as I am very new to Scala and Spark programming?

Accepted answer by Justin Pihony

You need to create your SparkContext with new, and that should solve it.
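
Applied to the question's program, the fix looks like this (a minimal sketch; the SparkConf setup with a local master is an assumption, since the accepted answer only names the missing new):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main(args: Array[String]) {
    // `SparkContext` by itself refers to the companion object;
    // `new SparkContext(conf)` actually constructs an instance.
    val conf = new SparkConf().setAppName("sampleScalaProject").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlcontext = new SQLContext(sc)
  }
}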

Answer by Shaido - Reinstate Monica

For newer versions of Spark (2.0+), use SparkSession:

import org.apache.spark.sql.SparkSession
val spark = SparkSession.builder.getOrCreate()

SparkSession can do everything SQLContext can do, but if needed the SQLContext can be accessed as follows:

val sqlContext = spark.sqlContext
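
For example, the session can register a temporary view and run SQL directly; a minimal sketch, where the appName, master, and the nums view name are illustrative assumptions:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.appName("example").master("local[*]").getOrCreate()
val df = spark.range(3)                 // small DataFrame with ids 0, 1, 2
df.createOrReplaceTempView("nums")      // expose it to SQL
spark.sql("SELECT id FROM nums WHERE id > 0").show()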

Answer by Viraj Wadate

We can simply create a SQLContext in the Scala shell (spark-shell already provides the SparkContext as sc):

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
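
A quick usage sketch in the same shell (the sample tuples and column names are made up for illustration):

scala> import sqlContext.implicits._
scala> val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "name")
scala> df.show()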

Answer by Ashutosh S

import org.apache.spark.{SparkConf, SparkContext}

// Build a SparkContext explicitly, then wrap it in a SQLContext.
val conf = new SparkConf().setAppName("SparkJoins").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new org.apache.spark.sql.SQLContext(sc)