Linux: Could not find or load main class org.apache.hadoop.util.VersionInfo

Disclaimer: this page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me). Original: http://stackoverflow.com/questions/21212629/

Date: 2020-08-07 01:52:13  Source: igfitidea

Could not find or load main class org.apache.hadoop.util.VersionInfo

Tags: java, linux, apache, ubuntu, hadoop

Asked by usb

I followed "http://codesfusion.blogspot.com/2013/10/setup-hadoop-2x-220-on-ubuntu.html" to install Hadoop on Ubuntu. But upon checking the Hadoop version I get the following error:

Error: Could not find or load main class org.apache.hadoop.util.VersionInfo

Also, when I try: hdfs namenode -format

I get the following error:

Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode

The Java version used is:

java version "1.7.0_25"
OpenJDK Runtime Environment (IcedTea 2.3.10) (7u25-2.3.10-1ubuntu0.12.04.2)
OpenJDK 64-Bit Server VM (build 23.7-b01, mixed mode)

Answered by y?s??la

Try to check:

  • JAVA_HOME and all PATH-related variables in the Hadoop config
  • run . ~/.bashrc (note the dot in front) to make those variables available in your environment. It seems that the guide does not mention this. (A sketch of such entries follows below.)
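
A minimal sketch of what those ~/.bashrc entries might look like (the paths are assumptions for a typical Ubuntu install; adjust them to your machine):

export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64   # assumed OpenJDK 7 location
export HADOOP_HOME=/usr/local/hadoop                  # assumed Hadoop install directory
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

# reload in the current shell (the leading dot is equivalent to source)
. ~/.bashrc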

Answered by Stephen C

You probably did not follow the instructions correctly. Here are some things to try and help us / you diagnose this:

  • In the shell where you ran hadoop version, run export and show us the list of relevant environment variables.

  • Show us what you put into the /usr/local/hadoop/etc/hadoop/hadoop-env.sh file.

  • If neither of the above gives you / us any clues, then find and use a text editor to (temporarily) modify the hadoop wrapper shell script. Add the line "set -xv" somewhere near the beginning. Then run hadoop version, and show us what it produces. (A sketch of these steps follows after this list.)

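For example, a sketch of those checks (the script path is an assumption based on the guide linked above):

# list the exported variables Hadoop typically depends on
export -p | egrep 'JAVA_HOME|HADOOP|PATH'

# trace the wrapper script without editing it (an alternative to adding "set -xv")
bash -xv /usr/local/hadoop/bin/hadoop version 2>&1 | head -n 50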

Answered by chinglun

Adding this line to ~/.bash_profile worked for me.

export HADOOP_PREFIX=/<where ever you install hadoop>/hadoop

So just:

  1. $ sudo open ~/.bash_profile then add the aforesaid line
  2. $ source ~/.bash_profile
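
To verify the variable took effect in a new shell, something like this should work (a sketch; the printed path depends on where you installed Hadoop):

echo $HADOOP_PREFIX                  # should print your Hadoop install directory
$HADOOP_PREFIX/bin/hadoop version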

Hope this helps (:

Answered by Somum

It is a problem with the environment variable setup. Apparently, I didn't find one that worked until now. I was trying this on 2.6.4. Here is what we should do:

export HADOOP_HOME=/home/centos/HADOOP/hadoop-2.6.4
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_CONF_DIR=$HADOOP_HOME
export HADOOP_PREFIX=$HADOOP_HOME
export HADOOP_LIBEXEC_DIR=$HADOOP_HOME/libexec
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
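# note: the next line overrides the HADOOP_CONF_DIR set above; etc/hadoop is where the config files normally live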
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop

Add these to your .bashrc and don't forget to run

source ~/.bashrc

I think your problem will be solved as was mine.

Answered by Elsayed

I got that error; I fixed it by editing ~/.bashrc as follows:

export HADOOP_HOME=/usr/local/hadoop
export PATH=$HADOOP_HOME/bin:$PATH

Then open a terminal and run this command:

source ~/.bashrc

Then check:

hadoop version

Answered by lambzee

I was facing the same issue. Although it may seem simple, it took away 2 hours of my time. I tried all the things above, but they didn't help.

I just exited the shell I was in and logged into the system again. Then things worked!

Answered by ozw1z5rd

I got the same problem with Hadoop 2.7.2. After I applied the trick shown below I was able to start HDFS, but later I discovered that the tar archive I was using was missing some important pieces. So after downloading 2.7.3, everything worked as it is supposed to.

My first suggestion is to re-download the tar.gz of the same version or major release.

If you are continuing to read... this is how I solved the problem. After a fresh install, Hadoop was not able to find the jars. I did this small trick:

  1. I located where the jars are.
  2. I created a symbolic link to that folder under $HADOOP_HOME/share/hadoop/common:

ln -s $HADOOP_HOME/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib $HADOOP_HOME/share/hadoop/common 

For the version command you need hadoop-common-2.7.2.jar; this helped me find where the jars were stored.

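One way to locate that jar (a sketch; the install prefix matches the output shown below):

find /opt/hadoop-2.7.2 -name 'hadoop-common-*.jar'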

After that...

在那之后...

$ bin/hadoop version 
Hadoop 2.7.2
Subversion https://git-wip-us.apache.org/repos/asf/hadoop.git -r b165c4fe8a74265c792ce23f546c64604acf0e41
Compiled by jenkins on 2016-01-26T00:08Z
Compiled with protoc 2.5.0
From source with checksum d0fda26633fa762bff87ec759ebe689c
This command was run using /opt/hadoop-2.7.2/share/hadoop/kms/tomcat/webapps/kms/WEB-INF/lib/hadoop-common-2.7.2.jar

Of course any hadoop / hdfs command works now.

I'm a happy man again. I know this is not a clean solution, but it works, at least for me.

Answered by Eduardo Sanchez-Ros

I added the environment variables described above, but it still didn't work. Setting HADOOP_CLASSPATH as follows in my ~/.bashrc worked for me:

export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH

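Note that $(hadoop classpath) can only be evaluated if the hadoop binary is already on your PATH, so in a sketch like the following (paths assumed) the PATH export has to come first:

export HADOOP_HOME=/usr/local/hadoop             # assumed install directory
export PATH=$HADOOP_HOME/bin:$PATH
export HADOOP_CLASSPATH=$(hadoop classpath):$HADOOP_CLASSPATH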

Answered by Giri

I used

export PATH=$HADOOP_HOME/bin:$PATH

Instead of

export PATH=$PATH:$HADOOP_HOME/bin

Then it worked for me!

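Prepending puts $HADOOP_HOME/bin ahead of anything else on the PATH, so a quick check like this (a sketch) shows which hadoop the shell actually picks up:

which -a hadoop    # the first entry should now be $HADOOP_HOME/bin/hadoop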