Converting a Scala Iterable[tuple] to RDD

Warning: this content is provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must follow the same license and attribute it to the original authors (not me): StackOverflow. Original question: http://stackoverflow.com/questions/33284507/



Tags: scala, apache-spark, rdd

Asked by oikonomiyaki

I have a list of tuples (String, String, Int, Double) that I want to convert to a Spark RDD.

In general, how do I convert a Scala Iterable[(a1, a2, a3, ..., an)] into a Spark RDD?


Accepted answer by GameOfThrows

There are a few ways to do this, but the most straightforward way is just to use Spark Context:


import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

// parallelize takes a Seq, so materialize the Iterable as a List first
val rdd: RDD[(String, String, Int, Double)] = sc.parallelize(YourIterable.toList)

I think sc.parallelize needs the conversion to a List, but it will preserve your tuple structure, so you will still get an RDD[(String, String, Int, Double)].
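The answer above can be expanded into a minimal, self-contained sketch. The session setup, object name, and the sample list below are illustrative assumptions, not part of the original answer; in a real job the SparkContext usually already exists.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.rdd.RDD

object IterableToRdd {
  def main(args: Array[String]): Unit = {
    // Local session purely for illustration (hypothetical app name).
    val spark = SparkSession.builder()
      .appName("iterable-to-rdd")
      .master("local[*]")
      .getOrCreate()
    val sc = spark.sparkContext

    // Hypothetical sample data matching the question's tuple shape.
    val data: Iterable[(String, String, Int, Double)] = Seq(
      ("a", "x", 1, 1.5),
      ("b", "y", 2, 2.5)
    )

    // parallelize expects a Seq, so materialize the Iterable first.
    // The element type (the 4-tuple) is preserved in the resulting RDD.
    val rdd: RDD[(String, String, Int, Double)] = sc.parallelize(data.toSeq)

    println(rdd.count()) // 2

    spark.stop()
  }
}
```

Note that an Iterable gives no random access or size guarantee, so the one-time `toSeq`/`toList` materialization is what lets Spark split the data into partitions.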