Scala: How to stop a running SparkContext before opening the new one
Disclaimer: this page is an English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original: http://stackoverflow.com/questions/36844075/
How to stop a running SparkContext before opening the new one
Asked by Klue
I am executing tests in Scala with Spark creating a SparkContext as follows:
val conf = new SparkConf().setMaster("local").setAppName("test")
val sc = new SparkContext(conf)
After the first execution there was no error. But now I am getting this message (and a failed test notification):
Only one SparkContext may be running in this JVM (see SPARK-2243).
It looks like I need to check if there is any running SparkContext and stop it before launching a new one (I do not want to allow multiple contexts). How can I do this?
UPDATE:
I tried this, but I get the same error (I am running the tests from IntelliJ IDEA, and I build the code before executing it):
val conf = new SparkConf().setMaster("local").setAppName("test")
// also tried: .set("spark.driver.allowMultipleContexts", "true")
UPDATE 2:
class TestApp extends SparkFunSuite with TestSuiteBase {

  // use longer wait time to ensure job completion
  override def maxWaitTimeMillis: Int = 20000

  System.clearProperty("spark.driver.port")
  System.clearProperty("spark.hostPort")

  var ssc: StreamingContext = _

  val config: SparkConf = new SparkConf().setMaster("local").setAppName("test")
    .set("spark.driver.allowMultipleContexts", "true")
  val sc: SparkContext = new SparkContext(config)

  //...

  test("Test1") {
    sc.stop()
  }
}
Answered by zero323
To stop an existing context, you can use the stop method on a given SparkContext instance.
import org.apache.spark.{SparkContext, SparkConf}
val conf: SparkConf = ???
val sc: SparkContext = new SparkContext(conf)
...
sc.stop()
To reuse an existing context or create a new one, you can use the SparkContext.getOrCreate method.
val sc1 = SparkContext.getOrCreate(conf)
...
val sc2 = SparkContext.getOrCreate(conf)
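A minimal sketch of the stop-then-getOrCreate pattern in local mode (not from the original answer; it assumes Spark is on the classpath, and the object name Demo is made up):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Demo extends App {
  val conf = new SparkConf().setMaster("local").setAppName("test")

  // getOrCreate returns the active context if one exists,
  // otherwise it creates a new one from the given conf.
  val sc1 = SparkContext.getOrCreate(conf)
  val sc2 = SparkContext.getOrCreate(conf)
  assert(sc1 eq sc2) // same instance, no "Only one SparkContext" error

  sc1.stop()

  // After stop(), a fresh context can be created safely.
  val sc3 = SparkContext.getOrCreate(conf)
  println(sc3.parallelize(1 to 4).sum()) // 10.0
  sc3.stop()
}
```

Because getOrCreate never constructs a second context while one is active, repeated calls are safe even when an earlier test forgot to stop its context.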
When used in test suites both methods can be used to achieve different things:
stop - stopping the context in the afterAll method (see for example MLlibTestSparkContext.afterAll)
getOrCreate - getting the active instance in individual test cases (see for example QuantileDiscretizerSuite)
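Combined, the two ideas above could look like the following suite sketch (assuming ScalaTest 3.x and Spark on the classpath; the suite name SharedContextSuite is hypothetical, not from the original answer):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.scalatest.BeforeAndAfterAll
import org.scalatest.funsuite.AnyFunSuite

// Hypothetical suite illustrating the pattern: one shared context,
// created once before the tests and stopped once in afterAll.
class SharedContextSuite extends AnyFunSuite with BeforeAndAfterAll {
  @transient private var sc: SparkContext = _

  override def beforeAll(): Unit = {
    super.beforeAll()
    sc = SparkContext.getOrCreate(
      new SparkConf().setMaster("local").setAppName("test"))
  }

  override def afterAll(): Unit = {
    try {
      if (sc != null) sc.stop() // free the JVM slot for the next suite
    } finally {
      super.afterAll()
    }
  }

  test("count") {
    assert(sc.parallelize(Seq(1, 2, 3)).count() === 3)
  }
}
```

Stopping the context in afterAll rather than inside individual tests (as the UPDATE 2 code did) avoids killing the context while sibling tests in the same suite still need it.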

