JAVA_HOME is not set in Hadoop

Note: this content is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/20628093/

JAVA_HOME is not set in Hadoop

Tags: java, hadoop, installation

Asked by ayushman999

I am a beginner with Hadoop, trying to install and run it on my Ubuntu machine as a single-node cluster. This is the JAVA_HOME setting in my hadoop_env.sh:

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386/
export HADOOP_CONF_DIR=${HADOOP_CONF_DIR:-"/etc/hadoop"}

But when I run it, the following errors appear:

Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.

How do I remove this error?

Answered by griffon vulture

Are you loading hadoop_env.sh? You may be referring to hadoop-env.sh (dash instead of underscore), which is under the conf directory.

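A quick way to confirm which filename actually exists (a sketch; the conf location depends on your Hadoop version and install path):

# Older releases keep it under conf/, newer ones under etc/hadoop/
ls "$HADOOP_HOME"/conf/hadoop-env.sh "$HADOOP_HOME"/etc/hadoop/hadoop-env.sh 2>/dev/null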

BTW, this is a very useful guide for a quick installation:

http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/

Answered by Ankur Shanbhag

Under your HADOOP_HOME/conf directory, please update the hadoop-env.sh file. It has an entry to export JAVA_HOME.

Setting JAVA_HOME to an appropriate value in this file should solve your issue.

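For example, the entry might end up looking like this (a sketch; the JDK path below is taken from the question, so substitute the path on your own system):

# In HADOOP_HOME/conf/hadoop-env.sh: replace the placeholder with an absolute JDK path
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386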

Answered by Sohil Jain

I debugged the scripts and found that even though JAVA_HOME is set in the environment, the value is lost when ssh connections to other hosts are made inside the scripts: the JAVA_HOME variable that showed up as set in start-dfs.sh became unset in hadoop-env.sh.

The solution to this problem is to set the JAVA_HOME variable in hadoop-env.sh; then it should work properly.

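You can see the underlying behavior for yourself: a non-interactive ssh command does not inherit variables exported in your current shell (a minimal sketch, assuming passwordless ssh to localhost is set up, as Hadoop requires):

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-i386
ssh localhost 'echo "JAVA_HOME over ssh: $JAVA_HOME"'  # typically prints an empty value

This is why the start scripts re-read hadoop-env.sh on each host instead of relying on your login environment.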

Answered by toobee

I had the same error and solved it with Sohil Jain's remark, but to make it even a bit clearer: hadoop-env.sh uses an expression such as

export JAVA_HOME=${JAVA_HOME}

If you hard-code the path to your JVM installation, it works:

export JAVA_HOME=/usr/lib/jvm/java...

Resolution via the environment variable as-is seems to fail; hard-coding fixed the problem for me.

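To find the path worth hard-coding, something like this works on most Linux systems (a sketch using standard tools; the result may point into a jre/ subdirectory, in which case use the enclosing JDK directory):

# Resolve the real JDK directory behind the `java` binary on your PATH
readlink -f "$(which java)" | sed 's:/bin/java$::'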

Answered by hoang

It does not handle the space between Program and Files in "Program Files". So I copied the JDK folder to C: (or any folder whose name contains no spaces) and set: export JAVA_HOME=Name_Path_Copied. Then it ran OK.

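For instance, under a Unix-style shell on Windows (a sketch; the space-free location below is hypothetical, substitute wherever you copied the JDK):

# Hypothetical space-free copy of a JDK originally under "C:\Program Files\Java"
export JAVA_HOME=/c/java/jdk1.8.0_65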

Answered by Jo Kachikaran

The above answers should work as long as you are using the default conf directory, $HADOOP_HOME/conf or $HADOOP_HOME/etc/hadoop. Here are a few things you should do if you're using a different conf folder.

  1. Copy the hadoop-env.sh file from the default conf directory to your conf folder, say /home/abc/hadoopConf.
  2. Replace the line

    #export JAVA_HOME=${JAVA_HOME}
    

    with the following:

    export JAVA_HOME=/usr/lib/jvm/java-8-oracle
    export HADOOP_CONF_DIR=/home/abc/hadoopConf
    

Change the values appropriately. If you have any other Hadoop-related environment variables configured in your .bashrc, .profile, or .bash_profile, consider adding them next to the above lines.

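With a non-default conf folder you can also pass it explicitly when starting the daemons (a sketch; --config is the standard option the Hadoop start scripts accept for overriding the conf directory):

# Start HDFS using the custom conf folder from the steps above
$HADOOP_HOME/sbin/start-dfs.sh --config /home/abc/hadoopConf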

Answered by mule.ear

I'm using Hadoop 2.8.0. Even though I exported JAVA_HOME (I put it in .bashrc), I still hit this error while trying to run start-dfs.sh.

user@host:/opt/hadoop-2.8.0 $ echo $JAVA_HOME
<path_to_java>
user@host:/opt/hadoop-2.8.0 $ $JAVA_HOME/bin/java -version
java version "1.8.0_65"
...
user@host:/opt/hadoop-2.8.0 $ sbin/start-dfs.sh
...
Starting namenodes on []
localhost: Error: JAVA_HOME is not set and could not be found.
localhost: Error: JAVA_HOME is not set and could not be found.

The only way I could get it to run was to add JAVA_HOME=path_to_java to etc/hadoop/hadoop-env.sh and then source it:

:/opt/hadoop-2.8.0 $ grep JAVA_HOME etc/hadoop/hadoop-env.sh
#export JAVA_HOME=${JAVA_HOME}
export JAVA_HOME=path_to_java
user@host:/opt/hadoop-2.8.0 $ source etc/hadoop/hadoop-env.sh

Maybe that (sourcing hadoop-env.sh) was implied in the posts above. Just thought someone should say it out loud. Now it runs. I've encountered other issues (due, I suspect, to the limited resources on the server I'm using), but at least I got past this one.

Answered by Haha TTpro

First, you must set JAVA_HOME in your hadoop-env.sh. (Your local JAVA_HOME in .bashrc is likely to be ignored somehow.)

# The java implementation to use.
export JAVA_HOME=/usr/lib/jvm/default-java

Then, set HADOOP_CONF_DIR to point to the directory containing your hadoop-env.sh. In ~/.bashrc, add the following lines:

HADOOP_CONF_DIR="/usr/local/hadoop/etc/hadoop"
export HADOOP_CONF_DIR

where /usr/local/hadoop/etc/hadoop is the directory containing hadoop-env.sh.

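To verify the setup afterwards (a sketch; paths assume the layout above):

source ~/.bashrc                                           # pick up HADOOP_CONF_DIR
grep '^export JAVA_HOME' "$HADOOP_CONF_DIR"/hadoop-env.sh  # confirm the hard-coded path
/usr/local/hadoop/sbin/start-dfs.sh                        # should no longer complain about JAVA_HOME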