Linux Hadoop "Unable to load native-hadoop library for your platform" warning

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/19943766/

Date: 2020-08-07 01:19:31  Source: igfitidea

Hadoop "Unable to load native-hadoop library for your platform" warning

Tags: java, linux, hadoop, hadoop2, java.library.path

Asked by Olshansk

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:


WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable


I'm running Hadoop 2.2.0.


Doing a search online brought up this link: http://balanceandbreath.blogspot.ca/2013/01/utilnativecodeloader-unable-to-load.html


However, the contents of the /native/ directory on Hadoop 2.x appear to be different, so I am not sure what to do.


I've also added these two environment variables in hadoop-env.sh:


export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/"

export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native/"


Any ideas?


Accepted answer by zhutoulala

I assume you're running Hadoop on 64-bit CentOS. The reason you saw that warning is that the native Hadoop library $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 was actually compiled for 32-bit.

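One quick way to confirm the mismatch without installing extra tools is to read the ELF class byte of the library (byte at offset 4 is 01 for 32-bit, 02 for 64-bit). This is a sketch; the /usr/local/hadoop fallback path is an assumption, so adjust it to your install:

```shell
# Report whether a shared object is a 32-bit or 64-bit ELF binary by
# reading its class byte (offset 4: 1 = 32-bit, 2 = 64-bit).
elf_class() {
  case "$(od -An -tu1 -j4 -N1 "$1" | tr -d ' ')" in
    1) echo "32-bit" ;;
    2) echo "64-bit" ;;
    *) echo "not an ELF file" ;;
  esac
}

# Compare the library's class with your kernel (uname -m; x86_64 is 64-bit).
lib="${HADOOP_HOME:-/usr/local/hadoop}/lib/native/libhadoop.so.1.0.0"
if [ -e "$lib" ]; then
  elf_class "$lib"
else
  echo "no library at $lib"
fi
```

If the function prints 32-bit while `uname -m` reports x86_64, you are in exactly the situation described above.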

Anyway, it's just a warning, and won't impact Hadoop's functionalities.


If you do want to eliminate this warning, download the Hadoop source code, recompile libhadoop.so.1.0.0 on a 64-bit system, and then replace the 32-bit one.


Steps on how to recompile source code are included here for Ubuntu:


Good luck.


Answer by MikeKulls

For installing Hadoop, it is so much easier to install the free version from Cloudera. It comes with a nice GUI that makes it simple to add nodes, there is no compiling or messing around with dependencies, and it comes with tools like Hive, Pig, etc.


http://www.cloudera.com/content/support/en/downloads.html


Steps are:

1) Download
2) Run it
3) Go to the web GUI (1.2.3.4:7180)
4) Add extra nodes in the web GUI (do NOT install the Cloudera software on other nodes, it does it all for you)
5) Within the web GUI go to Home, click Hue and Hue Web UI. This gives you access to Hive, Pig, Sqoop etc.


Answer by user2229544

@zhutoulala -- FWIW your links worked for me with Hadoop 2.4.0, with one exception: I had to tell Maven not to build the javadocs. I also used the patch in the first link for 2.4.0 and it worked fine. Here's the Maven command I had to issue:


mvn package -Dmaven.javadoc.skip=true -Pdist,native -DskipTests -Dtar

After building this and moving the libraries, don't forget to update hadoop-env.sh :)

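The "moving the libraries and updating hadoop-env.sh" steps might look roughly like this. The paths assume a 2.4.0 source tree built with the command above and a stock install layout; treat them as placeholders for your own setup:

```shell
# Copy the freshly built 64-bit native libraries over the bundled 32-bit ones
# (run from the root of the Hadoop source tree after mvn finishes)
cp hadoop-dist/target/hadoop-2.4.0/lib/native/* "$HADOOP_HOME/lib/native/"

# Then point Hadoop at them in $HADOOP_HOME/etc/hadoop/hadoop-env.sh
export HADOOP_COMMON_LIB_NATIVE_DIR="$HADOOP_HOME/lib/native"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"
```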

Thought this might help someone who ran into the same roadblocks as me


Answer by koti

In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. Still the problem persisted. Then I figured out that Hadoop was pointing to hadoop/lib, not to hadoop/lib/native. So I just moved all the content from the native library to its parent. And the warning was gone.

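A quick listing of both directories shows whether the .so files ended up where Hadoop is actually looking. This is a sketch; /usr/local/hadoop is an assumed fallback for HADOOP_HOME:

```shell
# List libhadoop* in both candidate directories to see where the files live.
H="${HADOOP_HOME:-/usr/local/hadoop}"
for d in "$H/lib" "$H/lib/native"; do
  printf '%s:\n' "$d"
  ls "$d"/libhadoop* 2>/dev/null || echo "  (no libhadoop here)"
done
```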

Answer by Neeraj

I had the same issue. It was solved by adding the following lines to .bashrc:


export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

Answer by Hoai-Thu Vuong

Just append the word native to your HADOOP_OPTS like this:


export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib/native"

PS: Thanks to Searene


Answer by KunBetter

export HADOOP_HOME=/home/hadoop/hadoop-2.4.1  
export PATH=$HADOOP_HOME/bin:$PATH  
export HADOOP_PREFIX=$HADOOP_HOME  
export HADOOP_COMMON_HOME=$HADOOP_PREFIX  
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native  
export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop  
export HADOOP_HDFS_HOME=$HADOOP_PREFIX  
export HADOOP_MAPRED_HOME=$HADOOP_PREFIX  
export HADOOP_YARN_HOME=$HADOOP_PREFIX  
export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

Answer by Vijayakumar

Move your compiled native library files to the $HADOOP_HOME/lib folder.


Then set your environment variables by editing the .bashrc file:


export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"

Make sure your compiled native library files are in the $HADOOP_HOME/lib folder.


It should work.


Answer by Kalyan Ghosh

This also would work:


export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
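One caveat: as written, this overwrites any existing LD_LIBRARY_PATH. A slightly safer variant (with the same assumed path) prepends instead:

```shell
# Prepend the Hadoop native dir so any existing LD_LIBRARY_PATH entries survive
export LD_LIBRARY_PATH="/usr/lib/hadoop/lib/native${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```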

Answer by Tom Kelly

This line right here:


export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH

From KunBetter's answer; that's where the money is.
