Host and port to use to list a directory in HDFS

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same CC BY-SA license and attribute it to the original authors (not me): StackOverflow

Original URL: http://stackoverflow.com/questions/27106862/
Asked by Tristan
First of all, I'm using the Hortonworks Sandbox as my Hadoop distribution, with no custom configuration at all.
Once connected to the sandbox, I'm able to list the files of an HDFS directory by doing:
[root@sandbox ~]# hadoop fs -ls hdfs:///user/guest
but if I try to specify a host and port, I only get errors:
[root@sandbox ~]# hadoop fs -ls hdfs://localhost:8020/user/guest
ls: Call From sandbox.hortonworks.com/10.0.2.15 to localhost:8020 failed on connection exception: java.net.ConnectException: Connexion refusée; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
[root@sandbox ~]# hadoop fs -ls hdfs://localhost:9000/user/guest
ls: Call From sandbox.hortonworks.com/10.0.2.15 to localhost:9000 failed on connection exception: java.net.ConnectException: Connexion refusée; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
Once I know the correct host and port to use, I'll be able to use them in my Java call:
Path pt = new Path("hdfs://host:port/user/guest/test-text-file.txt");
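As a sanity check before handing such a string to Hadoop's `Path`, the host, port, and path components of the `hdfs://` URI can be inspected with plain `java.net.URI`. A minimal stdlib sketch; the host and port here are only placeholders standing in for whatever your cluster actually uses:

```java
import java.net.URI;

public class HdfsUriCheck {
    public static void main(String[] args) {
        // Host and port below are illustrative placeholders, not cluster facts.
        URI uri = URI.create("hdfs://sandbox.hortonworks.com:8020/user/guest/test-text-file.txt");
        System.out.println(uri.getScheme()); // hdfs
        System.out.println(uri.getHost());   // sandbox.hortonworks.com
        System.out.println(uri.getPort());   // 8020
        System.out.println(uri.getPath());   // /user/guest/test-text-file.txt
    }
}
```

If `getHost()` returns null or `getPort()` returns -1, the URI string is malformed or missing the authority part, which is worth ruling out before blaming the cluster.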
Answered by Ashrith
Check the value of the property fs.defaultFS in core-site.xml; it contains the IP address/hostname and port that the NameNode daemon binds to when it starts up.
I see that you are using the Hortonworks sandbox. Here is the property in core-site.xml, which is located at /etc/hadoop/conf/core-site.xml:
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://sandbox.hortonworks.com:8020</value>
</property>
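Rather than eyeballing the file, the value can also be pulled out of core-site.xml programmatically. A minimal stdlib sketch: the property name and the sample value come from the answer above, while the helper class and inline XML string are purely illustrative (in practice you would parse `/etc/hadoop/conf/core-site.xml` itself):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class DefaultFsReader {
    // Returns the <value> of the <property> whose <name> matches, or null if absent.
    static String getProperty(Document doc, String name) throws Exception {
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String n = p.getElementsByTagName("name").item(0).getTextContent().trim();
            if (n.equals(name)) {
                return p.getElementsByTagName("value").item(0).getTextContent().trim();
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // Inline copy of the sandbox's core-site.xml fragment; replace with
        // parsing the real file at /etc/hadoop/conf/core-site.xml.
        String xml = "<configuration><property>"
                + "<name>fs.defaultFS</name>"
                + "<value>hdfs://sandbox.hortonworks.com:8020</value>"
                + "</property></configuration>";
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println(getProperty(doc, "fs.defaultFS"));
        // prints hdfs://sandbox.hortonworks.com:8020
    }
}
```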
So, you could try something like this:
hadoop fs -ls hdfs://sandbox.hortonworks.com:8020/user/guest
Or you could also replace sandbox.hortonworks.com with its IP address, taken from its respective entry in /etc/hosts, which on my VM looks something like this:
127.0.0.1 localhost.localdomain localhost
192.168.1.3 sandbox.hortonworks.com sandbox
So, I could try this as well:
hadoop fs -ls hdfs://192.168.1.3:8020/user/guest
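Putting the pieces together, the listing from the question can be done in Java with Hadoop's FileSystem API. A hedged sketch, assuming the sandbox's fs.defaultFS value of hdfs://sandbox.hortonworks.com:8020 from above and a hadoop-client dependency on the classpath; it is not runnable without a reachable cluster:

```java
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class ListHdfsDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Host and port taken from fs.defaultFS in core-site.xml.
        FileSystem fs = FileSystem.get(
                URI.create("hdfs://sandbox.hortonworks.com:8020"), conf);
        // Equivalent of: hadoop fs -ls hdfs://sandbox.hortonworks.com:8020/user/guest
        for (FileStatus status : fs.listStatus(new Path("/user/guest"))) {
            System.out.println(status.getPath());
        }
        fs.close();
    }
}
```

Note that if the Configuration is loaded with the cluster's core-site.xml on the classpath, FileSystem.get(conf) alone would resolve the same NameNode without hard-coding the URI.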