Spark + Python - Java gateway process exited before sending the driver its port number?
Disclaimer: this page is a Chinese-English parallel translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same CC BY-SA license, cite the original URL, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/31825911/
Spark + Python - Java gateway process exited before sending the driver its port number?
Asked by laukok
Why do I get this error on my browser screen,
: Java gateway process exited before sending the driver its port number args = ('Java gateway process exited before sending the driver its port number',) message = 'Java gateway process exited before sending the driver its port number'
for the following script:
#!/Python27/python
print "Content-type: text/html; charset=utf-8"
print

# Enable CGI debugging
import cgitb
cgitb.enable()

import os
import sys

# Path to the Spark installation (raw strings keep the backslashes literal)
os.environ['SPARK_HOME'] = r"C:\Apache\spark-1.4.1"

# Append pyspark to the Python path
sys.path.append(r"C:\Apache\spark-1.4.1\python")

from pyspark import SparkContext
from pyspark import SparkConf

print "Successfully imported Spark Modules"

# Initialize SparkContext
sc = SparkContext('local')
words = sc.parallelize(["scala", "java", "hadoop", "spark", "akka"])
print words.count()
I followed this example.
Any ideas how I can fix it?
Answered by architectonic
Check whether there is any extra information before the error line that says:
Error: Could not create the Java Virtual Machine.
In my case it was an invalid option that I had set in the conf file: the memory (initial heap size) is not allowed to be fractional. For example, 3.5g is not acceptable, whereas 3500m is.
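JVM heap sizes (and the Spark memory settings that feed into them) are written as a whole number with an optional k/m/g suffix. A minimal sketch for pre-checking a value before putting it in a conf file — the regex here is my own illustration, not Spark's actual parser:

```python
import re

# JVM heap sizes (-Xms/-Xmx) are a whole number plus an optional
# k/K, m/M, or g/G suffix; fractional values are not allowed.
JVM_SIZE = re.compile(r"^\d+[kKmMgG]?$")

def is_valid_heap_size(value):
    """Return True if `value` looks like a legal JVM heap size string."""
    return bool(JVM_SIZE.match(value))

print(is_valid_heap_size("3500m"))  # True: a whole number of megabytes
print(is_valid_heap_size("3.5g"))   # False: fractional sizes are rejected
```

Converting a fractional gigabyte value to megabytes (3.5g to 3500m, as the answer does) is the usual workaround.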
Answered by theheadofabroom
I had a similar issue to this, and eventually, when I looked at my test's output, there were error messages from $SPARK_HOME/bin/spark-class, with line numbers.
After tracking down what was going on on the affected lines, it turned out that there were single quotes around the $JAVA_HOME value in my environment variables, which was causing issues with path expansion (for some reason it was treated as relative to my home directory rather than as an absolute path).
While this may not be your exact issue, it is worth examining the start of your output for extra information to help in narrowing down the root cause.
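The same class of problem can be spotted in code: flag a JAVA_HOME value that is wrapped in stray quotes or is not an absolute path. The diagnostics below are my own illustration, not part of Spark:

```python
import os

def java_home_problems(java_home):
    """Return a list of likely problems with a JAVA_HOME value."""
    if not java_home:
        return ["JAVA_HOME is not set"]
    problems = []
    if java_home[0] in "'\"" or java_home[-1] in "'\"":
        # Quotes copied into the variable become part of the path itself
        problems.append("value is wrapped in quotes")
    elif not os.path.isabs(java_home):
        problems.append("value is not an absolute path")
    return problems

# A value with stray quotes, then a clean one (paths are hypothetical)
print(java_home_problems("'/usr/lib/jvm/java-8-openjdk'"))
print(java_home_problems("/usr/lib/jvm/java-8-openjdk"))
```

Running a check like this before creating the SparkContext can surface the misconfiguration earlier than the gateway error does.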
Answered by user7210421
My friend ran into the same problem as yours. I checked her computer and found that she had two versions of Java installed. I uninstalled the older one and rewrote the $JAVA_HOME value, and the problem was solved.
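When JAVA_HOME is set, Spark's launcher scripts run the JVM from $JAVA_HOME/bin/java, so after removing a stale install it is worth confirming that the remaining value still resolves to a real binary. A minimal sketch — the JDK path below is a hypothetical example:

```python
import os

def java_executable(java_home):
    """Path of the java binary a launcher would run for this JAVA_HOME."""
    exe = "java.exe" if os.name == "nt" else "java"
    return os.path.join(java_home, "bin", exe)

# Hypothetical JDK location; substitute the install you actually kept
jdk = "/usr/lib/jvm/java-8-openjdk"
print(java_executable(jdk))
print(os.path.isfile(java_executable(jdk)))  # False unless that JDK really exists
```

If the printed path does not exist, the gateway process dies before it can report a port, which produces exactly the error in the question.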