Convert array<string> into string pyspark dataframe


Can you try this way? You will have to import the functions module:

    from pyspark.sql.functions import concat_ws

    # emailed is already an array, so concat_ws can join it directly
    df.select(concat_ws(',', df.emailed).alias('string_form')).collect()

Let me know if that helps.

-----Update----

The code is explained in the link; I modified it a bit.

from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

def getter(column):
    # join the array elements into one comma-separated string
    col_new = ''
    for i, col in enumerate(column):
        if i == 0:
            col_new = col
        else:
            col_new = col_new + ',' + col
    return col_new

getterUDF = udf(getter, StringType())

df.select(getterUDF('Ur_Array_Column'))

You can try this as well.

Author by

user42361

Updated on June 04, 2022

Comments

  • user42361
    user42361 almost 2 years

I have a PySpark dataframe where some of its columns contain arrays of strings (and one column contains a nested array). As a result, I cannot write the dataframe to a CSV.

    Here is an example of the dataframe that I am dealing with -

        +-------+---------------+---------+
        |ID     |emailed        |clicked  |
        +-------+---------------+---------+
        |9000316|[KBR, NRT, AOR]|[[AOR]]  |
        |9000854|[KBR, NRT, LAX]|Null     |
        |9001996|[KBR, JFK]     |[[JFK]]  |
        +-------+---------------+---------+
    

    I would like to get the following structure, to be saved as a CSV.

        +-------+-------------+---------+
        |ID     |emailed      |clicked  |
        +-------+-------------+---------+
        |9000316|KBR, NRT, AOR|AOR      |
        |9000854|KBR, NRT, LAX|Null     |
        |9001996|KBR, JFK     |JFK      |
        +-------+-------------+---------+
    

    I am very new to pyspark. Your help is greatly appreciated. Thank you!