Java Hadoop: Unable to load native-hadoop library for your platform

Disclaimer: This page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license, keep the link to the original, and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/37098428/

Hadoop: Unable to load native-hadoop library for your platform

Tags: java, hadoop

Asked by El Mehdi Belgasmi

I've installed Hadoop 2.7.2 as a single node on Ubuntu and I want to run the Java wordcount program. The compilation and the creation of the jar file are done successfully, but when I run the jar file on Hadoop I receive this message:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

I set the environment variables by editing the .bashrc file:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib

When I type hadoop checknative -a, I get this:

hadoop: true /usr/local/hadoop/lib/native/libhadoop.so.1.0.0
zlib: true /lib/x86_64-linux-gnu/libz.so.1
snappy: false 
lz4: true revision:99
bzip2: false
openssl: true /usr/lib/x86_64-linux-gnu/libcrypto.so

16/05/09 00:48:53 INFO util.ExitUtil: Exiting with status 1

Hadoop version: 2.7.2

Ubuntu version: 14.04

Could anyone give some clues about the issue?

Answered by Nishu Tayal

Move your compiled native library files to the $HADOOP_HOME/lib folder.

Then set your environment variables by editing the .bashrc file:

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"

Make sure your compiled native library files are in the $HADOOP_HOME/lib folder.

It should fix the issue.

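Putting the steps together, here is a minimal sketch, assuming HADOOP_HOME=/usr/local/hadoop and that the compiled .so files currently sit in $HADOOP_HOME/lib/native (adjust the paths to your own installation):

cp $HADOOP_HOME/lib/native/* $HADOOP_HOME/lib/        # place the native .so files where the settings below point
echo 'export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib' >> ~/.bashrc
echo 'export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"' >> ~/.bashrc
source ~/.bashrc                                      # reload the environment in the current shell
hadoop checknative -a                                 # "hadoop: true ..." confirms the library is found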

Answered by Ishan Kumar

Try adding the Hadoop jars (hadoop-common.jar and hadoop-core.jar) to your classpath. You can do this easily in Eclipse, and when you create the jar file those jars will be referenced automatically.

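If you are not using Eclipse, a minimal command-line sketch is shown below; it puts every jar of the local Hadoop installation on the classpath via hadoop classpath (WordCount.java, wordcount_classes, wc.jar and the HDFS paths are hypothetical names):

mkdir -p wordcount_classes
javac -classpath "$(hadoop classpath)" -d wordcount_classes WordCount.java    # compile against the installed Hadoop jars
jar cf wc.jar -C wordcount_classes .                                          # package the compiled classes into a jar
hadoop jar wc.jar WordCount /user/hduser/input /user/hduser/output            # submit the job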
Answered by strom

Edit the file hadoop-env.sh in /usr/local/etc/hadoop.

Add the Hadoop native library directory to LD_LIBRARY_PATH:

export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH

Try it; it works for me.

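For example, a minimal sketch that appends the export to hadoop-env.sh; the path /usr/local/etc/hadoop is taken from the answer above, while on many 2.x installations the file lives under $HADOOP_HOME/etc/hadoop instead:

echo 'export LD_LIBRARY_PATH=/usr/local/hadoop/lib/native/:$LD_LIBRARY_PATH' >> /usr/local/etc/hadoop/hadoop-env.sh
# re-run the job afterwards; the "Unable to load native-hadoop library" warning should no longer appear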

Answered by Mike Onuorah

Add the command lines below to hadoop-env.sh; they should suppress the errors encountered:

export HADOOP_HOME_WARN_SUPPRESS=1
export HADOOP_ROOT_LOGGER="WARN,DRFA"
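
If the goal is only to silence this specific message rather than lowering logging globally, one common alternative is to raise the log level of the class that emits the warning; a minimal sketch, assuming the default log4j.properties under $HADOOP_HOME/etc/hadoop:

echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' >> $HADOOP_HOME/etc/hadoop/log4j.properties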