Original URL: http://stackoverflow.com/questions/44914144/
Warning: these answers are provided under the CC BY-SA 4.0 license. You are free to use/share them, but you must attribute them to the original authors (not me): StackOverFlow
ERROR SparkContext: Error initializing SparkContext. java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed
Asked by Pankaj Kumar
I have installed the following setup: Hadoop version 1.0.3, Java version "1.7.0_67", Scala version 2.11.7, Spark version 2.1.1.
I am getting the error below; can anyone help me with this?
root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing
<console>:14: error: not found: value spark
import spark.implicits._
<console>:14: error: not found: value spark
import spark.sql
Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
Answered by Alper t. Turker
There are a few different solutions:

1. Get your hostname:
   $ hostname
   then try to assign your host name:
   $ sudo hostname -s 127.0.0.1
   and start spark-shell.
2. Add your hostname to your /etc/hosts file (if not present):
   127.0.0.1 your_hostname
3. Add the env variable
   export SPARK_LOCAL_IP="127.0.0.1"
   to load-spark-env.sh.
4. The above steps solved my problem, but you can also try to add
   export SPARK_LOCAL_IP=127.0.0.1
   under the comment for the local IP in the template file spark-env.sh.template (/usr/local/Cellar/apache-spark/2.1.0/libexec/conf/) and then run
   cp spark-env.sh.template spark-env.sh
   spark-shell
   (a per-application variant is sketched after this list).
5. If none of the above fixes it, check your firewall and enable it, if not already enabled.
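As referenced in step 4, the same address pinning can also be done per application through Spark configuration instead of environment files; spark.driver.bindAddress is available from Spark 2.1 onward, and the same keys can be passed to spark-shell with --conf. Below is a minimal sketch, assuming a standalone Scala application (the object name and app name are only illustrative):

import org.apache.spark.sql.SparkSession

object LocalBindExample {
  def main(args: Array[String]): Unit = {
    // Pin the driver to loopback via configuration instead of SPARK_LOCAL_IP.
    val spark = SparkSession.builder()
      .appName("local-bind-example")
      .master("local[*]")
      .config("spark.driver.bindAddress", "127.0.0.1") // address the driver socket binds to (Spark 2.1+)
      .config("spark.driver.host", "127.0.0.1")        // address advertised to executors
      .getOrCreate()

    // If the BindException is gone, this prints the pinned address.
    println(spark.sparkContext.getConf.get("spark.driver.host"))
    spark.stop()
  }
}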
Answered by ktheitroadalo
Add SPARK_LOCAL_IP in load-spark-env.sh as
export SPARK_LOCAL_IP="127.0.0.1"
The load-spark-env.sh file is located in the spark/bin directory.
Or you can add your hostname to the /etc/hosts file as
127.0.0.1 hostname
You can get your hostname by typing hostname in the terminal.
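Once spark-shell starts cleanly, the address the driver actually bound to can be double-checked from inside the shell; a minimal check, assuming the sc value that spark-shell provides (spark.driver.bindAddress only appears if it was explicitly configured):

// Inside spark-shell: sc is the SparkContext created by the shell.
sc.getConf.get("spark.driver.host")              // address Spark resolved for the driver
sc.getConf.getOption("spark.driver.bindAddress") // Some(...) only if explicitly set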
Hope this solves the issue!
Answered by sonu1986
Had a similar issue in my IntelliJ.
Reason: I was on Cisco AnyConnect VPN.
Fix: once disconnected from the VPN, this issue did not appear.
Answered by linxx
- In your terminal, typing hostname lets you have a look at your current hostname. Then vim /etc/hosts and set the hostname you just got to your exact IP or 127.0.0.1 (a quick way to verify the resolution is sketched below).
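To confirm the /etc/hosts change behaves as expected, here is a small standalone check of what the local hostname resolves to, using only the JDK (the object name is hypothetical); if the printed address is not one of the machine's own addresses, the mapping is the likely culprit:

import java.net.InetAddress

object HostnameCheck {
  def main(args: Array[String]): Unit = {
    // Resolve the machine's hostname the same way the JVM running the Spark driver would.
    val local = InetAddress.getLocalHost
    println(s"hostname    = ${local.getHostName}")
    println(s"resolves to = ${local.getHostAddress}")
  }
}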

