Hadoop start-all.sh error: No such file or directory
Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must do so under the same license, link to the original, and attribute it to the original authors (not me): StackOverflow
Original post: http://stackoverflow.com/questions/22796550/
Asked by Yili Jiang
After I successfully created the name node, I ran into this problem when trying to start it. It looks as if the script is trying to log to a file that does not exist. How can I change my setup to direct the script's log to the correct directory?
bash-3.2$ start-all.sh
starting namenode, logging to /usr/local/bin/../logs/hadoop-Yili-namenode-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting datanode, logging to /usr/local/bin/../logs/hadoop-Yili-datanode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting secondarynamenode, logging to /usr/local/bin/../logs/hadoop-Yili-secondarynamenode-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
starting jobtracker, logging to /usr/local/bin/../logs/hadoop-Yili-jobtracker-wifi169-116.bucknell.edu.out
nice: /usr/local/bin/../bin/hadoop: No such file or directory
localhost: starting tasktracker, logging to /usr/local/bin/../logs/hadoop-Yili-tasktracker-wifi169-116.bucknell.edu.out
localhost: nice: /usr/local/bin/../bin/hadoop: No such file or directory
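The repeated "nice: /usr/local/bin/../bin/hadoop: No such file or directory" line shows where the startup scripts expect to find the hadoop binary. A quick way to check whether that file is really missing, using the paths taken from the output above, is:

ls -l /usr/local/bin/../bin/hadoop   # the binary the startup scripts cannot find
which start-all.sh                   # where the start script being run actually lives

If the first command fails, the hadoop binary is not where the scripts expect it, which is what the answer below addresses by pointing HADOOP_HOME at the real installation.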
Answered by Cast_A_Way
Try running which hadoop. If this command gives you output, then HADOOP_HOME has already been set in your .bashrc file.
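For example (the path below is only illustrative; yours depends on where Hadoop is actually installed):

which hadoop
# prints something like /opt/hadoop/bin/hadoop when HADOOP_HOME/bin is on the PATH;
# prints nothing and exits non-zero when hadoop is not on the PATH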
If it is not set, edit the .bashrc file in your home directory and add the statements below, which assume Hadoop is installed in /opt/hadoop. It may be in another location on your machine.
# Point HADOOP_HOME at the Hadoop installation directory (adjust if yours differs)
HADOOP_HOME=/opt/hadoop
export HADOOP_HOME
# Make the hadoop command and the start/stop scripts available on the PATH
PATH=$PATH:$HADOOP_HOME/bin
export PATH
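After saving .bashrc, reload it in the current shell and check that the hadoop binary is now found; hadoop version is a standard Hadoop command, and start-all.sh can then be rerun:

source ~/.bashrc
which hadoop       # should now print $HADOOP_HOME/bin/hadoop
hadoop version     # prints the installed Hadoop version if the setup is correct
start-all.sh       # rerun the daemons once the path resolves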
This will help you.