Java: com.sun.tools.javac.Main not found when trying to compile Hadoop program

Note: this page is a translation of a popular StackOverflow question and is provided under the CC BY-SA 4.0 license. If you reuse it, you must follow the same license and attribute it to the original authors (not me). Original source: http://stackoverflow.com/questions/27299273/

Date: 2020-11-02 11:34:34  Source: igfitidea


Tags: java, hadoop

Asked by Kudayar Pirimbaev

When I try to compile my program in Hadoop with this command


bin/hadoop com.sun.tools.javac.Main WordCounter.java

from Hadoop folder, it says


Error: Could not find or load main class com.sun.tools.javac.Main

I looked at similar threads where people suggested checking that JAVA_HOME is set properly. So in etc/hadoop/hadoop-env.sh I added this line


export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64

then checked that tools.pack was properly unpacked in /usr/lib/jvm/java-7-openjdk-amd64/lib, and it was. Then I tried javac -version, which gave


javac 1.7.0_65

I tried to reinstall Java but it didn't solve the problem.


Answered by ponkin

Try setting the HADOOP_CLASSPATH environment variable:


export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar
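If setting the variable alone doesn't fix it, it helps to confirm that the path actually points at an existing tools.jar. A minimal sketch; the JAVA_HOME default below is an assumption taken from the question and may differ on your machine:

```shell
# Set HADOOP_CLASSPATH and confirm the file it points at exists.
# The JAVA_HOME default is an assumption from the question.
export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-7-openjdk-amd64}
export HADOOP_CLASSPATH=$JAVA_HOME/lib/tools.jar

if [ -e "$HADOOP_CLASSPATH" ]; then
    echo "HADOOP_CLASSPATH OK: $HADOOP_CLASSPATH"
else
    echo "HADOOP_CLASSPATH points at a missing file: $HADOOP_CLASSPATH"
fi
```

If the second message appears, the problem is the JDK installation itself, not Hadoop.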

Answered by Aaron Digulla

The error means you are not using a JDK to start Hadoop. The main difference between the JRE (pure runtime) and the JDK is the Java compiler, javac. To see whether you have a Java compiler, you need to check two places: there should be a javac in the $JAVA_HOME/bin folder, and there must be a file $JAVA_HOME/lib/tools.jar.


In your case, the first one (the binary that starts the compiler) may be missing, but you absolutely need tools.jar.

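Those two checks can be scripted. A rough sketch, again assuming the JAVA_HOME path from the question:

```shell
# Check the two places that distinguish a JDK from a bare JRE.
# The JAVA_HOME default is an assumption from the question, not a fixed path.
JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-7-openjdk-amd64}

missing=0
for f in "$JAVA_HOME/bin/javac" "$JAVA_HOME/lib/tools.jar"; do
    if [ -e "$f" ]; then
        echo "found:   $f"
    else
        echo "missing: $f"
        missing=1
    fi
done

if [ "$missing" -eq 0 ]; then
    echo "looks like a JDK"
else
    echo "looks like a JRE - install a JDK"
fi
```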

You say that you have a tools.pack, but I haven't heard of that one before. Use your package manager to search for openjdk, then look for a package in the result list that says jdk. On my system, that would be openjdk-7-jdk. Install this package and the error should go away.


Answered by William Laszlo

I had to downgrade Hadoop to 2.9.2, and now it works.


I also had these in my environment:


export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk
export PATH=${JAVA_HOME}/bin:${PATH}
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar
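As a quick sanity check that the three exports fit together, you can verify that PATH really picks up the JDK's bin directory. A sketch; the JAVA_HOME default is this answer's path and may differ on your machine:

```shell
# Re-apply the exports and confirm PATH now contains JAVA_HOME/bin.
# The JAVA_HOME default is an assumption from the answer above.
export JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-1.8.0-openjdk}
export PATH=${JAVA_HOME}/bin:${PATH}
export HADOOP_CLASSPATH=${JAVA_HOME}/lib/tools.jar

case ":$PATH:" in
    *":$JAVA_HOME/bin:"*) echo "PATH includes $JAVA_HOME/bin" ;;
    *)                    echo "PATH is missing $JAVA_HOME/bin" ;;
esac
```

If PATH is set up this way, `javac -version` and the `bin/hadoop com.sun.tools.javac.Main` invocation will both resolve the same JDK.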