Check if a row value is null in a Spark DataFrame

Considering that sdf is a DataFrame, you can use a select statement:

from pyspark.sql.functions import when, col, lit

sdf.select("*", when(col("prod").isNull(), lit("new prod")).otherwise(col("prod")).alias("prod_1"))

Comments

  • sam, over 3 years ago

    I am using a custom function in PySpark to check a condition for each row of a Spark DataFrame and to add columns when the condition is true.

    The code is as below:

    from pyspark.sql.types import *
    from pyspark.sql.functions import *
    from pyspark.sql import Row
    
    def customFunction(row):
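        # note: row.prod is already a plain Python value here, not a Column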
        if (row.prod.isNull()):
            prod_1 = "new prod"
            return (row + Row(prod_1))
        else:
            prod_1 = row.prod
            return (row + Row(prod_1))
    
    sdf = sdf_temp.map(customFunction)
    sdf.show()
    

    I get the error mentioned below:

    AttributeError: 'unicode' object has no attribute 'isNull'

    How can I check for null values for specific columns in the current row in my custom function?
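
For the row-wise approach in the question, here is a minimal sketch of one way the check could be written: inside an RDD map, row.prod is already a plain Python value (or None for a null cell), so it can be tested with "is None" rather than the Column method .isNull(). sdf_temp and the prod column come from the question; rebuilding the row via asDict() is an assumption for illustration, not the asker's code.

from pyspark.sql import Row

def customFunction(row):
    # row.prod is a plain value here; None marks a null cell
    prod_1 = "new prod" if row.prod is None else row.prod
    # Row objects cannot be concatenated with "+", so rebuild the row
    # from its existing fields plus the new column
    return Row(**row.asDict(), prod_1=prod_1)

sdf = sdf_temp.rdd.map(customFunction).toDF()
sdf.show()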