How to save Spark RDD to local filesystem

Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/40226326/

Date: 2020-10-22 08:47:01  Source: igfitidea


Tags: scala, hadoop, apache-spark, dataframe, hive

Asked by roh

Can I save a file to the local filesystem with the saveAsTextFile syntax? This is how I'm writing the call to save a file: insert_df.rdd.saveAsTextFile("<local path>")


When I try to do this, I get a permissions error, even though I have full permissions on that local path. It looks like Spark is treating the path as an HDFS path.


Answered by Simon Schiff

I think you should try "file:///local path" instead of "/local path".

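A minimal sketch of that fix, assuming a local-mode SparkSession and hypothetical sample data standing in for the asker's insert_df (the output path /tmp/insert_df_output is a placeholder):

    import org.apache.spark.sql.SparkSession

    object SaveRddLocally {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SaveRddLocally")
          .master("local[*]") // assumption: local mode for illustration
          .getOrCreate()
        import spark.implicits._

        // Hypothetical stand-in for the asker's insert_df.
        val insert_df = Seq(("a", 1), ("b", 2)).toDF("key", "value")

        // The file:// scheme tells Hadoop's FileSystem API to write to the
        // local filesystem instead of the configured default (often HDFS).
        // Note: saveAsTextFile fails if the target directory already exists.
        insert_df.rdd.saveAsTextFile("file:///tmp/insert_df_output")

        spark.stop()
      }
    }

One caveat worth noting: on a multi-node cluster, a file:// path resolves on each executor's local disk, so the part files end up scattered across the worker machines. Getting a single file on the driver's local filesystem usually means collecting the data to the driver first or running in local mode.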