
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not the translator). Original question: http://stackoverflow.com/questions/34194019/

Date: 2020-11-02 22:35:51  Source: igfitidea

Convert Spark DataFrame to Pojo Object

Tags: java, apache-spark, apache-spark-sql

Asked by Don Mathew

Please see the code below:

    // Create the Spark context
    SparkConf sparkConf = new SparkConf().setAppName("TestWithObjects").setMaster("local");
    JavaSparkContext javaSparkContext = new JavaSparkContext(sparkConf);
    // Create an RDD from an existing java.util.List<Person> named 'persons'
    JavaRDD<Person> personsRDD = javaSparkContext.parallelize(persons);
    // Create the SQL context (Spark 1.x API)
    SQLContext sQLContext = new SQLContext(javaSparkContext);
    DataFrame personDataFrame = sQLContext.createDataFrame(personsRDD, Person.class);
    personDataFrame.show();
    personDataFrame.printSchema();
    personDataFrame.select("name").show();
    // Register the DataFrame as a temporary table so it can be queried with SQL
    personDataFrame.registerTempTable("peoples");
    DataFrame result = sQLContext.sql("SELECT * FROM peoples WHERE name='test'");
    result.show();

After this I need to convert the DataFrame 'result' to a Person object or a List<Person>. Thanks in advance.

Answered by Rahul

DataFrame is simply a type alias of Dataset[Row]. Operations on it are also referred to as "untyped transformations", in contrast to the "typed transformations" that come with strongly typed Scala/Java Datasets.

The conversion from Dataset[Row] to Dataset[Person] is very simple in Spark:

    DataFrame result = sQLContext.sql("SELECT * FROM peoples WHERE name='test'");

At this point, Spark converts your data into a DataFrame = Dataset[Row], a collection of generic Row objects, since it does not know the exact type.

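Both createDataFrame(rdd, Person.class) and Encoders.bean(Person.class) infer the schema from JavaBean conventions, so Person needs to be Serializable, expose getter/setter pairs, and have a public no-arg constructor. A minimal sketch of such a bean (the fields name and age are assumptions; the original post does not show the class):

```java
import java.io.Serializable;

// Hypothetical Person bean: the fields 'name' and 'age' are illustrative,
// not taken from the original post.
public class Person implements Serializable {
    private String name;
    private int age;

    // Encoders.bean requires a public no-arg constructor
    public Person() {}

    public Person(String name, int age) {
        this.name = name;
        this.age = age;
    }

    // Each getter/setter pair becomes a DataFrame column ('name', 'age')
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getAge() { return age; }
    public void setAge(int age) { this.age = age; }
}
```

With a bean like this in place, the encoder-based conversion below maps each Row's columns onto the matching bean properties by name.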

    // Create an Encoder for the Person Java bean
    Encoder<Person> personEncoder = Encoders.bean(Person.class);
    Dataset<Person> personDF = result.as(personEncoder);
    personDF.show();

Now, Spark converts the Dataset[Row] into a Dataset[Person] of type-specific Scala/Java JVM objects, as dictated by the class Person. To materialize the result as a plain java.util.List<Person>, as the question asks, you can then call personDF.collectAsList().

Please refer to the link below, provided by Databricks, for further details:

https://databricks.com/blog/2016/07/14/a-tale-of-three-apache-spark-apis-rdds-dataframes-and-datasets.html


Answered by Justin Pihony

A DataFrame is stored as Rows, so you can use the methods there to cast from untyped to typed. Take a look at the get methods.