bash - Working With Hadoop: localhost: Error: JAVA_HOME is not set

Disclaimer: this page is a translated copy of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/14325594/


Working With Hadoop: localhost: Error: JAVA_HOME is not set

Tags: bash, hadoop, ubuntu-12.04, java-home

Asked by Ali Ismail

I'm working with Ubuntu 12.04 LTS.


I'm going through the hadoop quickstart manual to make a pseudo-distributed operation. It seems simple and straightforward (easy!).


However, when I try to run start-all.sh I get:


localhost: Error: JAVA_HOME is not set.

I've read all the other advice on Stack Overflow for this issue and have done the following to ensure JAVA_HOME is set:


In /etc/hadoop/conf/hadoop-env.sh I have set:


JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME

In /etc/bash.bashrc I have set:


JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
PATH=$PATH:$JAVA_HOME/bin
export PATH

which java returns:


/usr/bin/java

java -version works.


echo $JAVA_HOME returns:


/usr/lib/jvm/java-6-oracle

I've even tried becoming root and explicitly writing the export in the terminal:


$ JAVA_HOME=/usr/lib/jvm/java-6-oracle
$ export JAVA_HOME
$ start-all.sh

If you could show me how to resolve this error it would be greatly appreciated. I'm thinking that my JAVA_HOME is being overridden somehow. If that is the case, could you explain to me how to make my exports global?


Answered by Krishna

I am using Hadoop 1.1 and faced the same problem.


I solved it by changing the JAVA_HOME variable in /etc/hadoop/hadoop-env.sh to:


export JAVA_HOME=/usr/lib/jvm/<jdk folder>
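
If you are not sure which directory to put there, commands like the following can help locate the installed JDK (the paths shown are only examples and depend on your distribution and JDK vendor):

# List the JDK installations under /usr/lib/jvm (directory names vary by distro and vendor)
ls /usr/lib/jvm
# Or resolve the JDK directory from the javac found on the PATH
readlink -f "$(which javac)" | sed 's:/bin/javac::'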

Answered by bitek

The way to solve this problem is to export the JAVA_HOME variable inside the conf/hadoop-env.sh file.


It doesn't matter if you already exported that variable in ~/.bashrc; it will still show the error.


So edit conf/hadoop-env.sh, uncomment the line "export JAVA_HOME", and add a proper filesystem path to it, i.e. the path to your Java JDK.


# The Java implementation to use. Required.
export JAVA_HOME="/path/to/java/JDK/"

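As a quick sanity check, you can source the file in a shell and confirm that the variable points at a real JDK (the path below is just the placeholder from above):

# Source hadoop-env.sh and verify that JAVA_HOME points at an existing JDK
. conf/hadoop-env.sh
echo "$JAVA_HOME"
ls "$JAVA_HOME/bin/java"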

Answered by Nitesh Chaturvedi

Extract from etc/hadoop/hadoop-env.sh:


The only required environment variable is JAVA_HOME. All others are optional. When running a distributed configuration it is best to set JAVA_HOME in this file, so that it is correctly defined on remote nodes.


This means it is better, and advised, to set JAVA_HOME here, even though the existing definition only re-reads the JAVA_HOME variable. It is probably not picking up the value of JAVA_HOME you set elsewhere, and the standard Apache manual does not mention this.

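A plausible reading of this, sketched below, is that the stock hadoop-env.sh only re-exports whatever JAVA_HOME already happens to be in the environment, and the non-interactive shell the start scripts open over ssh (even to localhost) typically does not pick up your interactive-shell exports, so the variable expands to nothing. Hard-coding the path avoids this (the JDK path shown is only an example):

# Stock line in hadoop-env.sh (illustrative):
# export JAVA_HOME=${JAVA_HOME}
# In the non-interactive ssh shell JAVA_HOME is usually unset, so this expands to an empty string.
export JAVA_HOME=/usr/lib/jvm/java-6-oracle   # adjust to your JDK location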

Answered by whitewalker

This error is coming from Line 180


if [[ -z $JAVA_HOME ]]; then
   echo "Error: JAVA_HOME is not set and could not be found." 1>&2
   exit 1
fi

in libexec/hadoop-config.sh.


Try echo $JAVA_HOME in that script. If it isn't recognized,


find your JAVA_HOME using this:


$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")


and replace the line


export JAVA_HOME=${JAVA_HOME} in /etc/hadoop/hadoop-env.sh with the JAVA_HOME you got from the above command.

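Putting the two steps together, a rough sketch could look like this (the location of hadoop-env.sh differs between installations, so adjust the path, and back the file up first):

# Detect the JDK directory from javac and append an explicit export to hadoop-env.sh
JDK_DIR=$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")
echo "export JAVA_HOME=$JDK_DIR" >> /etc/hadoop/hadoop-env.sh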

Answered by codeguru

I also faced a similar problem with Hadoop 1.1. I had not noticed that JAVA_HOME was commented out in hadoop/conf/hadoop-env.sh.


It was:


# JAVA_HOME=/usr/lib/jvm/java-6-oracle

I had to change it to:


JAVA_HOME=/usr/lib/jvm/java-6-oracle

Answered by Mr. Crowley

Regardless of Debian or any other Linux flavor, just know that ~/.bash_profile belongs to a specific user and is not system wide. In a pseudo-distributed environment Hadoop works on localhost, so the $JAVA_HOME in .bash_profile is of no use anymore.


Just export JAVA_HOME in ~/.bashrc and use it system wide.

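For reference, the lines in ~/.bashrc would look something like this (the JDK path is only an example; place them above any "return if not interactive" guard so scripts can see them too):

export JAVA_HOME=/usr/lib/jvm/java-6-oracle
export PATH="$PATH:$JAVA_HOME/bin"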

Answered by kometen

I ran into the same issue on Ubuntu 16.04 LTS. Running bash -vx ./bin/hadoop showed it tested whether java was a directory, so I changed JAVA_HOME to a folder and it worked.


++ [[ ! -d /usr/bin/java ]]
++ hadoop_error 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
++ echo 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
ERROR: JAVA_HOME /usr/bin/java does not exist.

So I changed JAVA_HOME in ./etc/hadoop/hadoop-env.sh to


export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre/

and hadoop starts fine. This is also mentioned in this article.

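The test that fails in the trace above expects JAVA_HOME to be a directory rather than the java binary itself; a quick manual check along the same lines:

# JAVA_HOME must name the JDK/JRE directory, not the java executable
if [[ -d "$JAVA_HOME" ]]; then echo "ok: $JAVA_HOME is a directory"; else echo "not a directory: $JAVA_HOME"; fi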

Answered by Paul Sanwald

The way to debug this is to put an "echo $JAVA_HOME" in start-all.sh. Are you running your hadoop environment under a different username, or as yourself? If the former, it's very likely that the JAVA_HOME environment variable is not set for that user.


The other potential problem is that you have specified JAVA_HOME incorrectly, and the value that you have provided doesn't point to a JDK/JRE. Note that "which java" and "java -version" will both work, even if JAVA_HOME is set incorrectly.

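A minimal way to apply that suggestion (where exactly you add the lines inside start-all.sh is up to you; this is only a sketch):

# Temporary debugging lines near the top of start-all.sh
echo "JAVA_HOME=$JAVA_HOME"            # is it set in the context the script runs in?
ls "$JAVA_HOME/bin/java" 2>/dev/null || echo "JAVA_HOME does not point at a JDK/JRE"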

Answered by Rajarsh

Check whether your alternatives entry is pointing to the right installation; you might actually be pointing to a different version and trying to alter the hadoop-env.sh of another installed version.


alternatives --install /etc/hadoop/conf [generic_name] [your correct path] [priority]   (for further details, check the man page of alternatives)


To set alternatives manually:


alternatives --set [generic name] [your current path].

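To see what the alternatives system currently points to before changing anything, you can display the registered candidates (the generic name "hadoop-conf" is only an example; it depends on how your distribution registered the Hadoop configuration):

alternatives --display hadoop-conf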

Answered by Raja Sekaran

Change the JAVA_HOME variable in conf/hadoop-env.sh:


export JAVA_HOME=/etc/local/java/<jdk folder>