
Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/46659616/

Date: 2020-10-22 09:29:00  Source: igfitidea

Retrieve SparkContext from SparkSession

scala, apache-spark

Asked by Naveen Balasubramanian

I'm running a Spark batch job and use SparkSession, as I need a lot of Spark SQL features in each of my components. The SparkContext is initialized in my parent component and passed to the child components as a SparkSession.


In one of my child components, I want to add two more configurations to my SparkContext. Hence, I need to retrieve the SparkContext from the SparkSession, stop it, and recreate the SparkSession with the additional configuration. To do so, how can I retrieve the SparkContext from the SparkSession?


Answered by ayplam

Just to post as an answer: in Scala the SparkContext can be accessed from a SparkSession via spark.sparkContext (no parentheses).

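A minimal Scala sketch of the full flow the asker describes: retrieve the SparkContext from the SparkSession, stop it, then rebuild the session with extra configuration. The `local[*]` master, app names, and the `spark.sql.shuffle.partitions` setting here are illustrative choices, not from the original post:

```scala
import org.apache.spark.sql.SparkSession

object RecreateSessionSketch {
  def main(args: Array[String]): Unit = {
    // Initial session (local master chosen for the sketch).
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("initial")
      .getOrCreate()

    // Retrieve the underlying SparkContext from the SparkSession
    // (a field accessor in Scala, so no parentheses).
    val sc = spark.sparkContext
    println(sc.appName)

    // Stop the context, then recreate the session with additional config.
    spark.stop()
    val reconfigured = SparkSession.builder()
      .master("local[*]")
      .appName("reconfigured")
      .config("spark.sql.shuffle.partitions", "8")
      .getOrCreate()

    println(reconfigured.conf.get("spark.sql.shuffle.partitions"))
    reconfigured.stop()
  }
}
```

Note that because `getOrCreate()` reuses any active session, the old one must be stopped before the builder's new `.config(...)` values will take effect in the same JVM.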

Answered by ChrisOdney

The sparkContext field does not seem to be public anymore (I am using Spark 2.3.2); however, you can retrieve it using a method of the same name:


spark.sparkContext()

This is applicable to Spark Java only.
