Scala - Error initializing SparkContext: A master URL must be set in your configuration
Note: this page is based on a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse or share it, you must keep the same license and attribute it to the original authors (not me): StackOverflow
Original source: http://stackoverflow.com/questions/42032169/
Asked by fakherzad
I used this code:
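(The code block was not captured in this copy of the question. Judging from the stack trace below, where an object named PCA fails at SparkContext construction and a deprecated spark.storage.memoryFraction setting is detected, it presumably resembled the following sketch; the app name and overall structure are assumptions, not the asker's exact code.)

import org.apache.spark.{SparkConf, SparkContext}

object PCA {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("PCA")                        // assumed app name
      .set("spark.storage.memoryFraction", "1") // deprecated setting, matching the WARN line in the log
      // no setMaster(...) call: this is what triggers the SparkException
    val sc = new SparkContext(conf)             // fails here, matching PCA.scala:26 in the trace
    // ... the actual PCA computation is elided ...
    sc.stop()
  }
}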
My error is:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/02/03 20:39:24 INFO SparkContext: Running Spark version 2.1.0
17/02/03 20:39:25 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/03 20:39:25 WARN SparkConf: Detected deprecated memory fraction settings: [spark.storage.memoryFraction]. As of Spark 1.6, execution and storage memory management are unified. All memory fractions used in the old model are now deprecated and no longer read. If you wish to use the old memory management, you may explicitly enable `spark.memory.useLegacyMode` (not recommended).
17/02/03 20:39:25 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
17/02/03 20:39:25 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
at org.apache.spark.SparkContext.<init>(SparkContext.scala:379)
at PCA$.main(PCA.scala:26)
at PCA.main(PCA.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:144)
Process finished with exit code 1
Answered by Hutashan Chandrakar
If you are running Spark standalone, then:

val conf = new SparkConf().setMaster("spark://master:7077") // this was missing; a spark:// URL needs host:port (7077 is the default master port)
or you can pass the master URL when submitting the job:

spark-submit --master spark://master:7077
If you are running Spark locally, then:

val conf = new SparkConf().setMaster("local[2]") // this was missing
or pass the master URL when submitting the job:
spark-submit --master local
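(For the local master URL: local runs Spark with a single worker thread, local[2] with two, and local[*] with one thread per logical core.)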
If you are running Spark on YARN, then:
spark-submit --master yarn
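For reference, a full spark-submit invocation for the job in this question would look roughly like this; the class name PCA comes from the stack trace, while the jar path is a placeholder assumption:

spark-submit --class PCA --master local[2] path/to/your-app.jar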
Answered by Yuval Itzchakov
The error message is pretty clear: you have to provide the address of the Spark master node, either via the SparkContext or via spark-submit:
val conf = 
  new SparkConf()
    .setAppName("ClusterScore")
    .setMaster("spark://172.1.1.1:7077") // <--- This is what's missing
    .set("spark.storage.memoryFraction", "1")
val sc = new SparkContext(conf)
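A related caveat: a master hard-coded via setMaster overrides whatever --master is later passed to spark-submit. A common workaround is SparkConf.setIfMissing, which applies a fallback only when no master was supplied externally (a sketch, reusing the app name from the answer above):

val conf =
  new SparkConf()
    .setAppName("ClusterScore")
    .setIfMissing("spark.master", "local[2]") // fallback for IDE runs; spark-submit --master still takes effect
val sc = new SparkContext(conf)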
Answered by Shyam Gupta
val conf = new SparkConf()
  .setAppName("Your Application Name")
  .setMaster("local")
val sc = new SparkContext(conf)
It will work.
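Alternatively, since the stack trace shows the job being launched from IntelliJ (com.intellij.rt.execution.application.AppMain), you can leave the code untouched and set the master in the run configuration's VM options instead; a SparkConf constructed with defaults picks up any spark.* Java system property:

-Dspark.master=local[*]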

