java.lang.NoClassDefFoundError when reading hadoop SequenceFile
Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/7559256/
Asked by Arsen Zahray
I am trying to read a SequenceFile with a custom Writable in it.
Here's the code:
// Imports used by this snippet; note that the Configuration here is
// org.apache.hadoop.conf.Configuration, not the org.apache.commons.configuration
// class named in the error further below.
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.SequenceFile.Reader;

public static void main(String[] args) throws IOException {
    String uri = "/tmp/part-r-00000";
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(uri), conf);
    Path path = new Path(uri);
    MyClass value = new MyClass();        // the custom Writable
    SequenceFile.Reader reader = null;
    try {
        reader = new Reader(fs, path, conf);
        while (reader.next(value)) {
            System.out.println(value.getUrl());
            System.out.println(value.getHeader());
            System.out.println(value.getImages().size());
            break;
        }
    } catch (Exception e) { // Catch exception if any
        System.err.println("Error: " + e.getMessage());
    } finally {
        IOUtils.closeStream(reader);
    }
}
When I run this, I get the following exception:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/commons/configuration/Configuration
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:37)
at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:34)
at org.apache.hadoop.security.UgiInstrumentation.create(UgiInstrumentation.java:51)
at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:196)
at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:159)
at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:216)
at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:409)
at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:395)
at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:1418)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1319)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:226)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:109)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:210)
at com.iathao.run.site.emr.DecryptMapReduceOutput.main(DecryptMapReduceOutput.java:32)
Caused by: java.lang.ClassNotFoundException: org.apache.commons.configuration.Configuration
at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
... 14 more
All libraries are packaged into the jar file and are present. What's wrong, and how do I fix this?
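As an aside on the reader loop above: SequenceFile.Reader.next(Writable) with a single argument reads the next key and skips the value, so reading records usually means creating a key object from the class recorded in the file header and calling the two-argument next(key, value). A minimal sketch of that loop, reusing the reader, conf and value variables from the code above together with the standard org.apache.hadoop.io.Writable and org.apache.hadoop.util.ReflectionUtils classes:

// Sketch: the two-argument next(key, value) reads one full record.
// The key object is instantiated from the key class stored in the file header.
Writable key = (Writable) ReflectionUtils.newInstance(reader.getKeyClass(), conf);
while (reader.next(key, value)) {
    System.out.println(key + "\t" + value.getUrl());
}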
Answered by Praveen Sripati
The missing class, org.apache.commons.configuration.Configuration, comes from Apache Commons Configuration (commons-configuration-*.jar), a library Hadoop itself depends on. That jar has to be on the runtime classpath along with the Hadoop jars, so add it to the project's dependencies.
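A quick way to confirm the jar really made it onto the runtime classpath is to load the class by name before touching any SequenceFile code. This is only an illustrative sketch (the class name ClasspathCheck is made up for the example):

// Minimal sanity check (sketch): prints a message depending on whether
// org.apache.commons.configuration.Configuration is visible at runtime.
public class ClasspathCheck {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.commons.configuration.Configuration");
            System.out.println("commons-configuration is on the classpath");
        } catch (ClassNotFoundException e) {
            System.err.println("still missing: " + e.getMessage());
        }
    }
}

If this reports the class as still missing, the jar was not actually bundled, or it is not on the classpath of the java command being used to run the program.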