I have a directory of directories on HDFS, and I want to iterate over the directories. Is there any easy way to do this with Spark using the SparkContext object?
You can use org.apache.hadoop.fs.FileSystem. Specifically, FileSystem.listFiles([path], true) recursively lists every file under the given path.
And with Spark:
FileSystem.get(sc.hadoopConfiguration).listFiles(..., true)
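For example, here is a minimal sketch (the hdfs:///data root is a hypothetical placeholder) that drains the returned RemoteIterator:

import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical root directory; substitute your own HDFS path.
val root = new Path("hdfs:///data")
val fs = FileSystem.get(sc.hadoopConfiguration)

// listFiles(path, recursive = true) yields a RemoteIterator[LocatedFileStatus],
// not a Scala collection, so drain it with a while loop.
val files = fs.listFiles(root, true)
while (files.hasNext) {
  println(files.next().getPath)
}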
Edit
It's worth noting that good practice is to get the FileSystem associated with the Path's scheme:
path.getFileSystem(sc.hadoopConfiguration).listFiles(path, true)
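And since the original question is about iterating over the directories themselves rather than the files, one option is FileSystem.listStatus, which returns the immediate children of a path as FileStatus objects you can filter. A sketch, again with a hypothetical hdfs:///data root:

import org.apache.hadoop.fs.Path

val root = new Path("hdfs:///data") // hypothetical placeholder
val fs = root.getFileSystem(sc.hadoopConfiguration)

// listStatus returns the immediate children; keep only the subdirectories.
val subDirs = fs.listStatus(root).filter(_.isDirectory).map(_.getPath)
subDirs.foreach(println)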