scala error: not found: type SparkConf

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/25012320/

Date: 2020-10-22 06:27:10  Source: igfitidea

error: not found: type SparkConf

Tags: scala, apache-spark

Asked by del

I installed Spark, both the pre-compiled and the standalone build, but neither can run val conf = new SparkConf(). The error is error: not found: type SparkConf:


scala> val conf = new SparkConf()
<console>:10: error: not found: type SparkConf

The pre-compiled build is Spark 0.9.1 with Scala 2.10.3; the standalone build is Spark 1.0.1 with Scala 2.10.4. For the standalone build, I compiled it with Scala 2.10.4. Your help will be much appreciated.


Answered by om-nom-nom

As was pointed out in the comments, your code lacks the appropriate import statement:


import org.apache.spark.SparkConf
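For context, here is a minimal sketch of how that import fits into a standalone Spark 1.x application (the object name and the word-count logic are illustrative, not part of the original answer):

```scala
// SparkConf and SparkContext both live in the org.apache.spark package.
import org.apache.spark.{SparkConf, SparkContext}

object SparkConfSketch {
  def main(args: Array[String]): Unit = {
    // With the import in place, SparkConf resolves correctly.
    val conf = new SparkConf()
      .setAppName("SparkConfSketch")
      .setMaster("local[*]") // run locally; on a cluster, spark-submit sets this

    val sc = new SparkContext(conf)

    // A trivial job to confirm the context works.
    val counts = sc.parallelize(Seq("a", "b", "a"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)
    counts.collect().foreach(println)

    sc.stop()
  }
}
```

Note that in the spark-shell REPL a SparkContext is pre-created for you as sc, but when you reference types like SparkConf yourself you still need the explicit import shown above.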