Disclaimer: this page is a Chinese-English translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you reuse or share it you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/26364057/

Date: 2020-08-11 02:22:17  Source: igfitidea

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration

Tags: java, hadoop, hbase

Asked by Abhisekh

I am using Hadoop 1.0.3 and HBase 0.94.22. I am trying to run a mapper program that reads values from an HBase table and writes them to a file. I am getting the following error:


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/HBaseConfiguration
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:340)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:149)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.HBaseConfiguration
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

The code is as follows:


import java.io.IOException;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Scan;
import org.apache.hadoop.hbase.filter.FirstKeyOnlyFilter;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableMapReduceUtil;
import org.apache.hadoop.hbase.mapreduce.TableMapper;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;


public class Test {

    static class TestMapper extends TableMapper<Text, IntWritable> {
        private static final IntWritable one = new IntWritable(1);

        @Override
        public void map(ImmutableBytesWritable row, Result value, Context context)
                throws IOException, InterruptedException {
            ImmutableBytesWritable userkey = new ImmutableBytesWritable(row.get(), 0, Bytes.SIZEOF_INT);
            String key = Bytes.toString(userkey.get());
            context.write(new Text(key), one);
        }
    }

    public static void main(String[] args) throws Exception {
        HBaseConfiguration conf = new HBaseConfiguration();
        Job job = new Job(conf, "hbase_freqcounter");
        job.setJarByClass(Test.class);
        Scan scan = new Scan();

        FileOutputFormat.setOutputPath(job, new Path(args[0]));
        String columns = "data";
        scan.addFamily(Bytes.toBytes(columns));
        scan.setFilter(new FirstKeyOnlyFilter());
        TableMapReduceUtil.initTableMapperJob("test", scan, TestMapper.class, Text.class, IntWritable.class, job);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

I exported the above code to a jar file, and on the command line I use the command below to run it:


hadoop jar /home/testdb.jar test


where test is the folder to which the mapper results should be written.


I have checked a few other links, like Caused by: java.lang.ClassNotFoundException: org.apache.zookeeper.KeeperException, where it was suggested to include the zookeeper jar in the classpath; but while creating the project in Eclipse I already included the zookeeper jar from the lib directory of HBase. The file I included is zookeeper-3.4.5.jar. I also visited HBase - java.lang.NoClassDefFoundError in java, but I am using a mapper class to get the values from the HBase table, not a client API. I know I am making a mistake somewhere; could you please help me out?


I have noticed another strange thing: when I remove all of the code in the main function except the first line, "HBaseConfiguration conf = new HBaseConfiguration();", then export the code to a jar file and try to run it as hadoop jar test.jar, I still get the same error. It seems either I am defining the conf variable incorrectly or there is some issue with my environment.

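For context on what the trace shows: RunJar loads the driver class with Class.forName, and the chained ClassNotFoundException means HBaseConfiguration is simply absent from the runtime classpath, even though it was present at compile time. A minimal, HBase-free sketch of that root cause (the HBase class name is reused here purely for illustration):

```java
// Class.forName throws ClassNotFoundException when the named class is not on
// the runtime classpath -- the same root cause that is chained under the
// NoClassDefFoundError in the trace above.
public class MissingClassDemo {
    public static void main(String[] args) {
        try {
            Class.forName("org.apache.hadoop.hbase.HBaseConfiguration");
            System.out.println("class found");
        } catch (ClassNotFoundException e) {
            System.out.println("not on classpath: " + e.getMessage());
        }
    }
}
```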

Answered by Abhisekh

I found the fix for the problem: I had not added the HBase classpath in the hadoop-env.sh file. Below is what I added to make the job work.


$ export HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:\
    $HBASE_HOME/hbase-0.94.22-test.jar:\
    $HBASE_HOME/conf:\
    ${HBASE_HOME}/lib/zookeeper-3.4.5.jar:\
    ${HBASE_HOME}/lib/protobuf-java-2.4.0a.jar:\
    ${HBASE_HOME}/lib/guava-11.0.2.jar
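A colon-separated list like the one above is easy to get subtly wrong, since a typo'd jar name fails silently. A minimal sketch for eyeballing it, assuming a hypothetical /usr/lib/hbase install path:

```shell
# Hypothetical install path -- substitute your own.
HBASE_HOME=/usr/lib/hbase
# Build the same style of colon-separated list as in hadoop-env.sh.
HADOOP_CLASSPATH=$HBASE_HOME/hbase-0.94.22.jar:$HBASE_HOME/conf:$HBASE_HOME/lib/zookeeper-3.4.5.jar
# Print one entry per line so typos and missing jars stand out.
echo "$HADOOP_CLASSPATH" | tr ':' '\n'
```

Running `ls -l` on each printed path confirms the jar actually exists before you submit the job.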

Answered by Spacez

In case someone has different paths/configuration, here is what I added to hadoop-env.sh in order to make it work:


$ export HADOOP_CLASSPATH="$HBASE_HOME/lib/hbase-client-0.98.11-hadoop2.jar:\
    $HBASE_HOME/lib/hbase-common-0.98.11-hadoop2.jar:\
    $HBASE_HOME/lib/protobuf-java-2.5.0.jar:\
    $HBASE_HOME/lib/guava-12.0.1.jar:\
    $HBASE_HOME/lib/zookeeper-3.4.6.jar:\
    $HBASE_HOME/lib/hbase-protocol-0.98.11-hadoop2.jar"

NOTE: if you haven't set $HBASE_HOME, you have two choices: either export HBASE_HOME=[your hbase installation path], or just replace $HBASE_HOME with the full path of your HBase installation.


Answered by Junyong

HADOOP_USER_CLASSPATH_FIRST=true \
HADOOP_CLASSPATH=$($HBASE_HOME/bin/hbase mapredcp) \
hadoop jar  /home/testdb.jar test 
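This works because hbase mapredcp prints the colon-separated list of jars HBase MapReduce jobs need, and because the shell's VAR=value cmd form sets each variable only for that single hadoop invocation. A minimal, HBase-free illustration of the per-invocation form:

```shell
# VAR=value cmd puts VAR into cmd's environment only, not the current shell.
GREETING=hello sh -c 'echo "$GREETING"'   # prints: hello
echo "${GREETING:-unset}"                 # prints: unset
```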

Answered by Vaclav C.

I tried editing the hadoop-env.sh file, but the changes mentioned here didn't work for me.


What worked is this:


export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"

I just added that at the end of my hadoop-env.sh. Do not forget to set your HBASE_HOME variable. You can also replace $HBASE_HOME with the actual path of your HBase installation.
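One quoting detail worth noting: the trailing /lib/* must reach the JVM as a literal asterisk (since Java 6, the launcher expands such a classpath entry to every .jar in that directory), so the double quotes keep the shell from globbing it first. A quick sketch with a hypothetical /opt/hbase path:

```shell
HBASE_HOME=/opt/hbase   # hypothetical path -- substitute your own
# Quoted: the * survives as a literal for the JVM to expand against lib/*.jar.
export HADOOP_CLASSPATH="$HADOOP_CLASSPATH:$HBASE_HOME/lib/*"
echo "$HADOOP_CLASSPATH"
```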


Answered by y durga prasad

Here, CreateTable is my Java class file.


Use this command:


java -cp .:/home/hadoop/hbase/hbase-0.94.8/hbase-0.94.8.jar:/home/hadoop/hbase/hbase-0.94.8/lib/* CreateTable
