How to convert a Map to a DataFrame?


First convert the map to a Seq, then you can use the toDF() function.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._

// Convert the map to a Seq of (key, value) pairs, then to a DataFrame with named columns.
val m = Map("A" -> 0.11164610291904906, "B" -> 0.11856755943424617, "C" -> 0.1023171832681312)
val df = m.toSeq.toDF("name", "score")
df.show

This will give you:

+----+-------------------+
|name|              score|
+----+-------------------+
|   A|0.11164610291904906|
|   B|0.11856755943424617|
|   C| 0.1023171832681312|
+----+-------------------+
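If your map is actually typed as Map[Any, Any], as in the question, toDF won't compile because Spark can't derive an encoder for Any. A minimal sketch of one way around that, assuming the keys are strings and the values are doubles, is to cast each entry to concrete types before calling toDF:

import org.apache.spark.sql.SparkSession
import scala.collection.mutable

val spark = SparkSession.builder.getOrCreate()
import spark.implicits._

// Same data as in the question, but typed as Map[Any, Any].
val m: mutable.Map[Any, Any] = mutable.Map(
  "A" -> 0.11164610291904906,
  "B" -> 0.11856755943424617,
  "C" -> 0.1023171832681312)

// Cast each entry to concrete types so Spark can derive an encoder,
// then convert to a DataFrame as before.
val df = m.toSeq
  .map { case (k, v) => (k.toString, v.asInstanceOf[Double]) }
  .toDF("name", "score")

df.show

If you can, it's cleaner to declare the map with concrete types (e.g. Map[String, Double]) in the first place so no cast is needed.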
Comments

  • Muz over 3 years

    m is a Map as follows:

    scala> m
    res119: scala.collection.mutable.Map[Any,Any] = Map(A-> 0.11164610291904906, B-> 0.11856755943424617, C -> 0.1023171832681312)
    

    I want to get:

    name  score
    A     0.11164610291904906
    B     0.11856755943424617
    C     0.1023171832681312
    

    How do I get the final DataFrame?