Exiting Spark-shell from the scala script

Disclaimer: this page is a Chinese–English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not the translator). Original question: http://stackoverflow.com/questions/36054145/

Date: 2020-10-22 08:05:35  Source: igfitidea


Tags: scala, apache-spark

Asked by Ujjwal SIddharth

I am using this command to run scala scripts.

spark-shell -i test.scala

At the end of the script's execution, I still see spark-shell running.

I have used ":q/:quit" in the test.scala script to try and exit but it's not working.

我在 test.scala 脚本中使用了 ":q/:quit" 来尝试退出,但它不起作用。

Accepted answer by charles gomes

You need to add exit() at the end of your script to avoid dropping into the scala REPL.

helloworld.scala

print("Hello World");
print("Second Line");
print("Bye now");
exit()

Run the above with:

spark-shell -i helloworld.scala

Answered by Leif Wickland

I also like the echo :quit | spark-shell ... answer that was offered on another question.
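A minimal sketch of the piping idea: the string :quit is written to the downstream process's stdin, so the REPL reads it as its first command after the script finishes. Here cat stands in for spark-shell so the sketch runs without a Spark installation; the real invocation would be along the lines of echo :quit | spark-shell -i test.scala.

```shell
# Pipe the REPL's quit command into the downstream process's stdin.
# "cat" is a stand-in for spark-shell; it simply echoes what it reads,
# showing that ":quit" arrives on stdin.
echo :quit | cat
```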

Answered by abhijitcaps

In version 2.4.3, System.exit(0) works.
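One practical side effect of ending the script with System.exit(0) is that the spark-shell process's exit status becomes checkable from a driver script, which matters in cron jobs or CI. A sketch of the check, using sh -c 'exit 0' as a stand-in for a spark-shell run that terminates via System.exit(0) (the real command would be spark-shell -i test.scala):

```shell
# Stand-in for a spark-shell run whose script ends in System.exit(0):
# the process terminates with status 0, which $? then reports.
sh -c 'exit 0'
echo "exit status: $?"
```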