"Error: Could not find or load main class" when starting Hadoop with Java

Disclaimer: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/19680615/

Date: 2020-08-12 19:21:04  Source: igfitidea

"Error: Could not find or load main class" when starting Hadoop

Tags: java, windows, hadoop

Asked by Jakub

I'm trying to run Hadoop (2.2.0) on my Windows 7 machine (yes, I know it would be better to run it on Linux, but that is not an option at the moment). I followed the instructions posted at http://ebiquity.umbc.edu/Tutorials/Hadoop/14%20-%20start%20up%20the%20cluster.html and http://blog.sqltrainer.com/2012/01/installing-and-configuring-apache.html

Everything went fine until I tried to start Hadoop. Every operation I try to run finishes with an "Error: Could not find or load main class ..." error.
For example, running

./hadoop version

ends up with

Error: Could not find or load main class org.apache.hadoop.util.VersionInfo

It definitely looks like a classpath problem. However, I have no idea how to solve it. I tried setting different environment variables, such as $HADOOP_COMMON_HOME and $HADOOP_HOME, but with no luck.

Any ideas?

Answered by Basim Khajwal

When you get this error message, it usually means either that you are using the wrong Java version or that the program was compiled with a different Java version than the one you are running it with.

You can check your version by opening cmd (the command prompt) and typing java -version.

Answered by chinglun

Adding this line to ~/.bash_profile worked for me:

export HADOOP_PREFIX=/where_ever_you_install_hadoop/hadoop

FYI, I have the same answer to this post: Could not find or load main class org.apache.hadoop.util.VersionInfo

Answered by Gerald Chu

I've also been trying to get Hadoop up and running on Windows 7. For me, the problem was that Hadoop passes CLASSPATH in Cygwin format:

CLASSPATH=/cygdrive/c/foo:/cygdrive/c/bar

However, Java expects CLASSPATH in Windows format:

CLASSPATH=c:\foo;c:\bar
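Cygwin's cygpath is the real tool for this conversion, but its effect can be sketched with plain sed (an illustrative stand-in only, not a replacement for cygpath):

```shell
# Illustrative stand-in for `cygpath -p -w`: rewrite a Unix-style
# CLASSPATH such as /cygdrive/c/foo:/cygdrive/c/bar into Windows form.
unix_cp='/cygdrive/c/foo:/cygdrive/c/bar'
win_cp=$(printf '%s' "$unix_cp" \
  | sed -e 's|:/cygdrive/|;/cygdrive/|g' \
        -e 's|/cygdrive/\([a-z]\)|\1:|g' \
        -e 's|/|\\|g')
echo "$win_cp"   # c:\foo;c:\bar
```

The three substitutions turn the entry separators into semicolons, collapse each /cygdrive/X prefix into a drive letter, and flip the remaining slashes.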

Looking at hadoop-0.19.1 showed me how they handled this. You can insert the following statements into bin/hadoop, just before it invokes Java at the end (and repeat for the other Java-invoking shell scripts):

# Detect whether we are running under Cygwin
cygwin=false
case "`uname`" in
CYGWIN*) cygwin=true;;
esac

if $cygwin; then
  echo Cygwin
  # Convert colon-separated Unix path lists to semicolon-separated Windows form
  CLASSPATH=`cygpath -p -w "$CLASSPATH"`
  # Convert single directories to Windows (short DOS) form
  HADOOP_HOME=`cygpath -d "$HADOOP_HOME"`
  HADOOP_LOG_DIR=`cygpath -d "$HADOOP_LOG_DIR"`
  TOOL_PATH=`cygpath -p -w "$TOOL_PATH"`
fi

export CLASSPATH=$CLASSPATH
echo $CLASSPATH
exec "$JAVA" $JAVA_HEAP_MAX $HADOOP_OPTS $CLASS "$@"

Answered by Bala

I faced this problem myself. This is what solved it for me.

Add the following to the ~/.bashrc file:

export HADOOP_CLASSPATH=$(cygpath -pw $(hadoop classpath)):$HADOOP_CLASSPATH

Note: You can install Hadoop 2.2+ directly on Windows. You don't need Cygwin.

Answered by Amardeep Kohli

My problem was that the ResourceManager (YARN) was not able to load the Hadoop libraries (JARs). I solved it by updating the configuration, adding this to yarn-site.xml:

<property>
  <name>yarn.application.classpath</name>
  <value>C:/hadoop-2.8.0/share/hadoop/mapreduce/*,C:/hadoop-2.8.0/share/hadoop/mapreduce/lib/*,C:/hadoop-2.8.0/share/hadoop/common/*,C:/hadoop-2.8.0/share/hadoop/common/lib/*,
    C:/hadoop-2.8.0/share/hadoop/hdfs/*,C:/hadoop-2.8.0/share/hadoop/hdfs/lib/*,C:/hadoop-2.8.0/share/hadoop/yarn/*,C:/hadoop-2.8.0/share/hadoop/yarn/lib/*</value>
</property>
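That long value is just the same four Hadoop module directories repeated, so a small shell loop can generate it (the C:/hadoop-2.8.0 base path is illustrative; adjust it to your install):

```shell
# Build the yarn.application.classpath value from a base install dir.
base='C:/hadoop-2.8.0/share/hadoop'
value=''
for d in mapreduce common hdfs yarn; do
  # Each module contributes its own jars plus its lib/ dependencies
  value="${value:+$value,}$base/$d/*,$base/$d/lib/*"
done
echo "$value"
```

Keeping $value quoted throughout prevents the shell from glob-expanding the * entries, which must reach YARN literally.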

Please note that the paths used here will vary depending on where Hadoop is installed on your system.