Note: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must likewise follow the CC BY-SA license, link the original URL, and attribute it to the original authors (not me): StackOverflow
Original question: http://stackoverflow.com/questions/27250527/
Could not parse Master URL: 'spark:http://localhost:18080'
Asked by Anas
When I try to run my code, it throws this exception:
Exception in thread "main" org.apache.spark.SparkException: Could not parse Master URL:spark:http://localhost:18080
This is my code:
SparkConf conf = new SparkConf().setAppName("App_Name")
        .setMaster("spark:http://localhost:18080")
        .set("spark.ui.port", "18080");
JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(1000));
String[] filet = new String[]{"Obama", "ISI"};
JavaReceiverInputDStream<Status> reciverStream = TwitterUtils.createStream(ssc, filet);
JavaDStream<String> statuses = reciverStream.map(new Function<Status, String>() {
    public String call(Status status) {
        return status.getText();
    }
});
ssc.start();
ssc.awaitTermination();
Any idea how I can fix this problem?
Accepted answer by icza
The problem is that you specify two schemes in the URL you pass to SparkConf.setMaster().
The spark is the scheme, so you don't need to add http after spark. See the javadoc of SparkConf.setMaster() for more examples.
So the master URL you should be using is "spark://localhost:18080". Change this line:
SparkConf conf = new SparkConf().setAppName("App_Name")
.setMaster("spark://localhost:18080").set("spark.ui.port","18080");
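To see why the original string fails, plain java.net.URI (used here purely as an illustration, not Spark's actual parser) shows the same effect: with a single colon, "spark" becomes the scheme and the embedded http URL becomes an opaque scheme-specific part, so no host or port can be extracted.

```java
import java.net.URI;

public class MasterUrlDemo {
    public static void main(String[] args) {
        // Broken form: "spark" is the scheme; everything after the first
        // colon is an opaque scheme-specific part, so host/port are lost.
        URI bad = URI.create("spark:http://localhost:18080");
        System.out.println(bad.getScheme());             // spark
        System.out.println(bad.getSchemeSpecificPart()); // http://localhost:18080
        System.out.println(bad.getHost());               // null

        // Corrected form: parses as a hierarchical URI with host and port.
        URI good = URI.create("spark://localhost:18080");
        System.out.println(good.getHost()); // localhost
        System.out.println(good.getPort()); // 18080
    }
}
```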
Answered by ruze
The standard port for the master is 7077, not 18080. Maybe you can try 7077.
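For context (a sketch, not taken from the question): in a default Spark standalone deployment, 7077 is the port the master accepts submissions on, 8080 is the master web UI, and 18080 is conventionally the history server UI, which may be where the 18080 in the question came from. A typical conf would then look like:

```java
import org.apache.spark.SparkConf;

// Sketch assuming a standalone master running locally on its default port;
// 18080 is conventionally the history server UI, not the master port.
SparkConf conf = new SparkConf()
        .setAppName("App_Name")
        .setMaster("spark://localhost:7077"); // spark:// scheme, default master port
```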