Hadoop on OSX "Unable to load realm info from SCDynamicStore"

Note: this page is a translation of a popular Stack Overflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must follow the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/7134723/

Tags: macos, hadoop, osx-lion

Asked by Travis Nelson

I am getting this error on startup of Hadoop on OSX 10.7:

Unable to load realm info from SCDynamicStore
put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/travis/input/conf. Name node is in safe mode.

It doesn't appear to be causing any issues with the functionality of Hadoop.

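As an aside, the SafeModeException above is a separate issue from the SCDynamicStore warning: the NameNode rejects writes while it is in safe mode, which it normally leaves on its own once enough blocks have been reported. If it stays stuck, you can tell it to leave manually; a minimal sketch, assuming the hadoop command is on your PATH (on Hadoop 2.x the equivalent is hdfs dfsadmin):

# check whether the NameNode is still in safe mode
hadoop dfsadmin -safemode get
# force it out of safe mode if it never leaves on its own
hadoop dfsadmin -safemode leave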

Answered by Jeromy Carriere

Matthew Buckett's suggestion in HADOOP-7489 worked for me. Add the following to your hadoop-env.sh file:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"
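
(The OX.AC.UK realm and ox.ac.uk KDC hosts are simply the values from the original reporter; as the answers below show, even empty values satisfy the lookup.) After appending the line, restart the daemons and check that the warning is gone; a sketch, assuming a default single-node layout with scripts under $HADOOP_HOME/bin and logs under $HADOOP_HOME/logs:

$HADOOP_HOME/bin/stop-all.sh
$HADOOP_HOME/bin/start-all.sh
# the daemons log the warning at startup, so clean logs mean the fix took
grep -r "SCDynamicStore" "$HADOOP_HOME/logs" || echo "warning no longer logged"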

Answered by mdaniel

As an update to this (and to address David Williams' point about Java 1.7), I found that setting only the .realm and .kdc properties was insufficient to stop the offending message.

However, by examining the source file that emits the message, I was able to determine that setting the .krb5.conf property to /dev/null was enough to suppress the message. Obviously, if you actually have a krb5 configuration, it is better to specify the actual path to it.

In total, my hadoop-env.sh snippet is as follows:

HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"
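
If you do have a real Kerberos configuration, the last line would instead point at it; a sketch, assuming the conventional /etc/krb5.conf location (adjust the path to wherever your file actually lives):

# point the JVM at a real Kerberos configuration instead of /dev/null
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/etc/krb5.conf"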

Answered by user411279

I'm having the same issue on OS X 10.8.2, Java version 1.7.0_21. Unfortunately, the above solution does not fix the problem with this version :(

Edit: I found the solution to this, based on a hint I saw here. In the hadoop-env.sh file, change the JAVA_HOME setting to:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

(Note the backticks, i.e. grave quotes, here.)
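
They make the shell run /usr/libexec/java_home and substitute its output, so JAVA_HOME ends up pointing at the newest installed Java 1.6 JDK. A quick sanity check before editing the file (a sketch; the exact path printed depends on which JDKs are installed):

# print the JDK home that java_home resolves for Java 1.6
/usr/libexec/java_home -v 1.6
# confirm that the resolved JVM actually runs
`/usr/libexec/java_home -v 1.6`/bin/java -version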

Answered by btiernay

FYI, you can simplify this further by only specifying the following:

export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

This is mentioned in HADOOP-7489 as well.

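A quick way to confirm that the empty-valued properties are enough on your machine is to run any HDFS command that previously printed the warning; a sketch:

# before the fix, this printed the SCDynamicStore warning on stderr
hadoop fs -ls / 2>&1 | grep SCDynamicStore || echo "no warning"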

Answered by Vladimir Kroz

I had a similar problem on MacOS, and after trying different combinations this is what worked for me universally (both Hadoop 1.2 and 2.2):

in $HADOOP_HOME/conf/hadoop-env.sh set the following lines:

# Set Hadoop-specific environment variables here.
export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

# The java implementation to use.
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`
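
One caveat: conf/hadoop-env.sh is the Hadoop 1.x layout. Hadoop 2.x tarballs usually keep the same file under etc/hadoop instead; this is an assumption about the default tarball layout, so adjust if your distribution differs:

# Hadoop 1.x layout
$HADOOP_HOME/conf/hadoop-env.sh
# Hadoop 2.x layout
$HADOOP_HOME/etc/hadoop/hadoop-env.sh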

Hope this will help

Answered by KaKa

and also add

YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

before executing start-yarn.sh (or start-all.sh) on cdh4.1.3

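YARN_OPTS is normally read from yarn-env.sh rather than hadoop-env.sh; a sketch of appending it there, assuming a CDH4 / Hadoop 2.x tarball layout:

# append to the YARN environment file so start-yarn.sh picks it up
echo 'YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"' >> "$HADOOP_HOME/etc/hadoop/yarn-env.sh"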

Answered by JnBrymn

I had this error when debugging MapReduce from Eclipse, but it was a red herring. The real problem was that I should have been remote debugging, by adding debugging parameters to JAVA_OPTS:

-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044

And then creating a new "Remote Java Application" profile in the debug configuration that pointed to port 1044.

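Putting that together, the debug flags can be passed through HADOOP_OPTS when launching a job from the shell; a sketch, where myjob.jar and com.example.MyDriver are hypothetical placeholders for your own job:

# suspend=y makes the JVM block until a debugger attaches on port 1044
export HADOOP_OPTS="$HADOOP_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044"
hadoop jar myjob.jar com.example.MyDriver input output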

This article has some more in-depth information about the debugging side of things. It's talking about Solr, but works much the same with Hadoop. If you have trouble, stick a message below and I'll try to help.
