bash - Could not find or load main class org.apache.hadoop.fs.FsShell

Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse it, you must do so under the same CC BY-SA license, cite the original source, and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/38293364/



Tags: bash, hadoop

Asked by mdivk

I understand this question might have been answered already, but my issue is still here:


I have a VM created for Hadoop on VMware using CentOS 7. I can start the namenode and datanode; however, when I try to view HDFS files using the following command:


hdfs dfs -ls

it throws the error below:


Could not find or load main class org.apache.hadoop.fs.FsShell

My Google searches suggest this might be related to the Hadoop variable settings in bash. Here are my settings:


# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi
export HADOOP_HOME=/opt/hadoop/hadoop-2.7.2
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME

export HIVE_HOME=/opt/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH

export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=${PATH}:${JAVA_HOME}/bin

export PIG_HOME=/opt/hadoop/pig-0.15.0
export PIG_HADOOP_VERSION=0.15.0
export PIG_CLASSPATH=$HADOOP_HOME/etc/hadoop

export PATH=$PATH:$PIG_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_USER_CLASSPATH_FIRST=true

export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin

export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/
export PATH=$PATH:$HADOOP_CLASSPATH

# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=
# User specific aliases and functions

I checked my hadoop folder, /opt/hadoop/hadoop-2.7.2/share/hadoop/common; the directory listing is shown in a screenshot (not reproduced here).


I am doing this exercise using the root account. Can anyone help find the cause of this issue and fix it? Thank you very much.

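A quick way to check whether these settings actually expose FsShell (a diagnostic sketch only, with the install path assumed from the question) is to print the classpath the hadoop wrapper uses and confirm that the hadoop-common jar, which contains org.apache.hadoop.fs.FsShell, appears in it:

hadoop classpath | tr ':' '\n'                # one classpath entry per line
hadoop classpath | tr ':' '\n' | grep common  # look for the common jar / glob
# The second command should match an entry such as
# /opt/hadoop/hadoop-2.7.2/share/hadoop/common/* (or hadoop-common-2.7.2.jar itself);
# if nothing matches, the hadoop wrapper being run is not the one configured above.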

Answered by Filmon Gebreyesus

This typically happens when you have multiple instances of Hadoop. Run which hadoop and see whether it points to the version that you have installed.


Say it points to /usr/bin/hadoop and not /your-path/hadoop; then you can point /usr/bin/hadoop to your installation (with a symlink).

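A minimal sketch of that check and fix, assuming the install path from the question; /usr/bin/hadoop below is only an example of a stale wrapper:

which hadoop                    # which hadoop binary is first on the PATH?
hadoop version                  # which installation does it actually run?
# If it resolves to a stale copy such as /usr/bin/hadoop, repoint it to the
# real installation with a symlink (run as root):
ln -sfn /opt/hadoop/hadoop-2.7.2/bin/hadoop /usr/bin/hadoop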

Answered by user1872329

As Filmon Gebreyesus pointed out, this can happen when you have multiple Hadoop instances. First, check what you have in $PATH; there should be a path to hadoop/bin.


If it is still not working, run whereis hdfs and check the output. If there is an hdfs that should not be there, remove or move it.

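A short sketch of those checks; the stray /usr/bin/hdfs path below is only illustrative:

echo "$PATH" | tr ':' '\n'      # list every PATH entry on its own line
whereis hdfs                    # list every hdfs binary/script found on the system
# If a stray copy (for example /usr/bin/hdfs) shadows
# /opt/hadoop/hadoop-2.7.2/bin/hdfs, move it out of the way:
mv /usr/bin/hdfs /usr/bin/hdfs.bak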

Answered by JMess

Try to unset $HADOOP_COMMON_HOME


unset HADOOP_COMMON_HOME

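For example (a sketch; unset only affects the current shell, so also remove the export line from ~/.bashrc if this turns out to be the fix):

unset HADOOP_COMMON_HOME        # clears the variable in the current shell only
hdfs dfs -ls /                  # re-run the failing command to verify
# If this fixes it, delete the "export HADOOP_COMMON_HOME=$HADOOP_HOME" line from
# ~/.bashrc so new shells do not set it again.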