java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration

Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must attribute it to the original authors (not the translator). Original question: http://stackoverflow.com/questions/29092926/


java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration

Tags: java, hadoop, filesystems

Asked by iliden

I keep getting this error. I've included the Hadoop commons and core libraries in the classpath, but I still keep getting this error. Help would be highly appreciated.


Answered by Abhishek

Add a dependency on hadoop-core.

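In a Maven project, that dependency would look something like the sketch below. The version shown is illustrative; note that the hadoop-core artifact only exists for Hadoop 1.x — in Hadoop 2.x and later its contents were split into hadoop-common and hadoop-client, so pick the artifact that matches your cluster version.

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-core</artifactId>
    <version>1.2.1</version> <!-- example version; use the one matching your cluster -->
</dependency>
```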

Answered by Tristan Reid

Here's how to troubleshoot: Look inside the jar that you're executing to see if that class file is actually there:


jar tvf target/my-jar-with-dependencies.jar | grep hadoop/conf/Configuration.class

If it's not, you need to add it to your classpath or change the way your jar is packaged.

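You can also check from inside a running JVM whether the class is visible, without depending on Hadoop at compile time. This is a minimal sketch (the class name `ClasspathCheck` is made up for illustration); it uses `Class.forName`, which throws exactly the `ClassNotFoundException` from the question when the jar is absent:

```java
public class ClasspathCheck {
    public static void main(String[] args) {
        String name = "org.apache.hadoop.conf.Configuration";
        try {
            // Attempt to load the class via the application classloader.
            Class.forName(name);
            System.out.println(name + " is on the classpath");
        } catch (ClassNotFoundException e) {
            // This is the same failure mode as the original error.
            System.out.println(name + " is NOT on the classpath");
        }
    }
}
```

Run it with the same classpath you use for your job; if it prints "NOT on the classpath", the problem is packaging or the launch command, not your code.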

Are you using Maven or some similar build tool? You may have a dependency with a 'scope', which means that it will only be compiled into your jar in certain circumstances.


    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>${hadoop.version}</version>
        <scope>provided</scope>
    </dependency>

In this example, the scope tag tells Maven that you're using this dependency for building, but that it will be provided at runtime, so the jar is not packaged with your artifact. You'll either need to remove this tag or add the Hadoop jar to the classpath with `-cp /path/to/jar.jar` at runtime. Another example of such a scope is 'test', which indicates that the jar is only needed on the path during unit tests.

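Concretely, the second option looks something like the following (the jar paths and main-class name are placeholders for illustration, not values from the original question):

```
# Keep <scope>provided</scope> in the pom, but supply the Hadoop jars
# on the classpath when launching the application:
java -cp target/my-jar.jar:/opt/hadoop/share/hadoop/common/* com.example.MyJob
```

On a real cluster, launching through `hadoop jar` instead of plain `java` puts the Hadoop jars on the classpath for you, which is why 'provided' scope is the conventional choice for MapReduce jobs.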