Problem starting tasktracker in Hadoop under Windows
Note: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use or share it, but you must do so under the same license, link to the original question, and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/6276642/
Problem starting tasktracker in hadoop under windows
Asked by Charlie Epps
I am trying to use Hadoop under Windows and I am running into a problem when I try to start the tasktracker. For example:
$ bin/start-all.sh
then the logs say:
2011-06-08 16:32:18,157 ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Failed to set permissions of path: /tmp/hadoop-Administrator/mapred/local/taskTracker to 0755
at org.apache.hadoop.fs.RawLocalFileSystem.checkReturnValue(RawLocalFileSystem.java:525)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:507)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:318)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:183)
at org.apache.hadoop.mapred.TaskTracker.initialize(TaskTracker.java:630)
at org.apache.hadoop.mapred.TaskTracker.<init>(TaskTracker.java:1328)
at org.apache.hadoop.mapred.TaskTracker.main(TaskTracker.java:3430)
What's the problem? How can I solve this? Thanks!
Answered by BRM
I ran into this issue with an installation of 1.0.3 on Windows Server. I changed the default directories in hdfs-site.xml so that the directories Hadoop creates for the DFS are subdirectories of the Cygwin directory, like this...
...
<property>
  <name>dfs.name.dir</name>
  <value>c:/cygwin/usr/mydir/dfs/logs</value>
</property>
<property>
  <name>dfs.data.dir</name>
  <value>c:/cygwin/usr/mydir/dfs/data</value>
</property>
</configuration>
This seemed to resolve the problem.
The Apache documentation for the config files is here.
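For completeness, here is a rough sketch of the follow-up steps from a Cygwin shell, assuming the paths from the snippet above; the exact commands are my assumption and not part of the original answer.

# Run from the Hadoop install directory in a Cygwin shell (paths assumed from the config above).
mkdir -p /cygdrive/c/cygwin/usr/mydir/dfs/logs /cygdrive/c/cygwin/usr/mydir/dfs/data
# Reformatting erases any existing HDFS metadata, so only do this on a fresh install.
bin/hadoop namenode -format
bin/start-all.sh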
Answered by Dave L.
This issue is being tracked at https://issues.apache.org/jira/browse/HADOOP-7682
Answered by Mohyt
Change the owner of the hadoop-Administrator folder; you can use the chown command for that.
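A minimal sketch of what that could look like from a Cygwin shell, assuming the TaskTracker runs as the Administrator account (the user name and paths are taken from the error message above, not from this answer):

# Create the directory tree if it is missing, take ownership, and open up the permissions.
mkdir -p /tmp/hadoop-Administrator/mapred/local/taskTracker
chown -R Administrator /tmp/hadoop-Administrator
chmod -R 755 /tmp/hadoop-Administrator
# Verify the new owner and mode before restarting the TaskTracker.
ls -ld /tmp/hadoop-Administrator/mapred/local/taskTracker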
Answered by gb96
This issue was raised on the Apache Hadoop user mailing list. It appears to be a problem in some release versions of Hadoop and not others.
A simple solution is to download a different version of Hadoop (assuming you do not require a specific Hadoop version for some other reason).
I encountered this exact issue with version 1.0.0 (beta).
I then tried 0.23.0 but got a fatal ClassNotFoundException:
log4j:ERROR Could not find value for key log4j.appender.NullAppender
log4j:ERROR Could not instantiate appender named "NullAppender".
Exception in thread "main" java.lang.ClassNotFoundException: hadoop-mapreduce-examples-0.23.0.jar
at java.net.URLClassLoader.run(URLClassLoader.java:366)
at java.net.URLClassLoader.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.hadoop.util.RunJar.main(RunJar.java:182)
Finally I tried version 0.22.0 and that worked without error. Therefore I recommend you try downloading and installing version 0.22.0: http://hadoop.apache.org/common/releases.html#10+December%2C+2011%3A+release+0.22.0+available
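If you want to try that, here is a sketch of the download and a quick smoke test from a Cygwin shell; the mirror URL is an assumption based on the Apache archive layout, so adjust it to whatever the releases page above points you at.

# Fetch and unpack the 0.22.0 release (URL assumed; check the releases page linked above).
wget http://archive.apache.org/dist/hadoop/common/hadoop-0.22.0/hadoop-0.22.0.tar.gz
tar -xzf hadoop-0.22.0.tar.gz
cd hadoop-0.22.0
# Confirm the version before copying your conf/ files over and re-running bin/start-all.sh.
bin/hadoop version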
Answered by QuinnG
There appears to be a permissions issue related to the path /tmp/hadoop-Administrator/mapred/local/taskTracker, as evidenced by the error message:
ERROR org.apache.hadoop.mapred.TaskTracker: Can not start task tracker because java.io.IOException: Failed to set permissions of path: /tmp/hadoop-Administrator/mapred/local/taskTracker
The account the TaskTracker is being started under needs the ability to chmod the specified folder. It may need more control, such as ownership, for other aspects; I don't recall the specific permissions required by the various components of a Hadoop setup.
I haven't dealt with the permission-setup side of Hadoop much, especially on Windows (at all), so what I'm saying is based heavily on the error message you've provided. I also haven't dealt with Cygwin folder permissions, so I don't know the exact fix, but hopefully this points you in the right direction.
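One quick way to tell whether this is a Cygwin/NTFS problem rather than a Hadoop one is to check whether chmod even takes effect on that mount. This is my own hedged diagnostic, not something from the original answer:

# Create a scratch file where the TaskTracker wants to write and try to set 0755 on it.
touch /tmp/hadoop-perm-test
chmod 755 /tmp/hadoop-perm-test
# If the mode shown is not rwxr-xr-x, the failure is at the Cygwin/NTFS level, not in Hadoop.
ls -l /tmp/hadoop-perm-test
rm /tmp/hadoop-perm-test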