Converting a Scala Iterable[tuple] to RDD

There are a few ways to do this, but the most straightforward is to use the SparkContext:

import org.apache.spark._
import org.apache.spark.rdd._
import org.apache.spark.SparkContext._

// parallelize expects a Seq, so convert the Iterable first
sc.parallelize(YourIterable.toList)

Note that sc.parallelize takes a Seq, hence the .toList (or .toSeq) conversion, but it preserves your tuple structure, so you still get an RDD[(String, String, Int, Double)].
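For context, here is a minimal, self-contained sketch of the same idea (the object name, app name, and local master are just illustrative assumptions; in spark-shell you would use the provided sc directly):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.rdd.RDD

object IterableToRdd {
  def main(args: Array[String]): Unit = {
    // Local master for illustration only; adjust to your cluster setup.
    val sc = new SparkContext(
      new SparkConf().setAppName("iterable-to-rdd").setMaster("local[*]"))

    // Any Iterable of tuples will do; parallelize expects a Seq.
    val data: Iterable[(String, String, Int, Double)] =
      Seq(("a", "x", 1, 1.0), ("b", "y", 2, 2.0))

    val rdd: RDD[(String, String, Int, Double)] = sc.parallelize(data.toSeq)

    rdd.collect().foreach(println) // tuple structure is preserved

    sc.stop()
  }
}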

Comments

  • oikonomiyaki

    I have a list of tuples, (String, String, Int, Double), that I want to convert to a Spark RDD.

    In general, how do I convert a Scala Iterable[(a1, a2, a3, ..., an)] into a Spark RDD?