Java: How to get the absolute paths of files in a directory?

Disclaimer: This page is a translation of a popular StackOverflow question and its answers, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me). Original question: http://stackoverflow.com/questions/18034758/

Date: 2020-08-11 21:56:18  Source: igfitidea

How to get absolute paths of files in a directory?

Tags: java, hadoop, bigdata

Asked by Petr Shypila

I have a directory with files, directories, subdirectories, etc. How I can get the list of absolute paths to all files and directories using the Apache Hadoop API?

Accepted answer by Tariq

Using the HDFS API:

package org.myorg.hdfsdemo;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;



public class HdfsDemo {

    public static void main(String[] args) throws IOException {

        Configuration conf = new Configuration();
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/core-site.xml"));
        conf.addResource(new Path("/Users/miqbal1/hadoop-eco/hadoop-1.1.2/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        System.out.println("Enter the directory name :");
        BufferedReader br = new BufferedReader(new InputStreamReader(System.in));
        Path path = new Path(br.readLine());
        displayDirectoryContents(fs, path);
    }

    private static void displayDirectoryContents(FileSystem fs, Path rootDir) {
        try {

            FileStatus[] status = fs.listStatus(rootDir);
            for (FileStatus file : status) {
                if (file.isDir()) {
                    System.out.println("This is a directory:" + file.getPath());
                    displayDirectoryContents(fs, file.getPath());
                } else {
                    System.out.println("This is a file:" + file.getPath());
                }
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}

Answered by venkat balabhadra

Write a recursive function that takes a file and checks whether it is a directory. If it is, list all the files in it; in a for loop, check whether each entry is itself a directory and recurse if so, otherwise collect the file and return the list.

Something like the snippet below, but not exactly the same (here I am returning only .java files):

import java.io.File;
import java.util.ArrayList;
import java.util.List;

private static List<File> recursiveDir(File file) {
    if (!file.isDirectory()) {
        // Not a directory; nothing to traverse.
        return null;
    }

    List<File> returnList = new ArrayList<File>();
    File[] files = file.listFiles();
    if (files == null) {
        // listFiles() returns null on an I/O error or access denial.
        return returnList;
    }
    for (File f : files) {
        if (!f.isDirectory()) {
            if (f.getName().endsWith(".java")) {
                returnList.add(f);
            }
        } else {
            // f is a directory, so recursiveDir(f) never returns null here.
            returnList.addAll(recursiveDir(f));
        }
    }
    return returnList;
}
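
For the plain local filesystem (outside Hadoop), the same recursive listing can also be written with the standard java.nio.file API. This is a minimal sketch, assuming Java 8 or later; the class name LocalWalkDemo is illustrative:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class LocalWalkDemo {

    // Recursively collect the absolute paths of all regular files under rootDir.
    static List<Path> listFilesRecursively(Path rootDir) throws IOException {
        // Files.walk traverses the tree depth-first; the try-with-resources
        // block closes the underlying directory streams.
        try (Stream<Path> walk = Files.walk(rootDir)) {
            return walk.filter(Files::isRegularFile)
                       .map(Path::toAbsolutePath)
                       .collect(Collectors.toList());
        }
    }

    public static void main(String[] args) throws IOException {
        Path root = Paths.get(args.length > 0 ? args[0] : ".");
        for (Path p : listFilesRecursively(root)) {
            System.out.println(p);
        }
    }
}
```

Unlike the hand-rolled recursion above, Files.walk propagates I/O errors as exceptions instead of silently returning null, which is usually easier to debug.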

Answered by Rahul

With HDFS you can use hadoop fs -lsr . to list files recursively (note that in newer Hadoop versions -lsr is deprecated in favor of hadoop fs -ls -R).