Filter PySpark dataframe column with None value


Solution 1

You can use Column.isNull / Column.isNotNull:

from pyspark.sql.functions import col

df.where(col("dt_mvmt").isNull())

df.where(col("dt_mvmt").isNotNull())

If you simply want to drop NULL values you can use na.drop with the subset argument:

df.na.drop(subset=["dt_mvmt"])

Equality-based comparisons with NULL won't work because in SQL NULL is undefined, so any attempt to compare it with another value returns NULL:

sqlContext.sql("SELECT NULL = NULL").show()
## +-------------+
## |(NULL = NULL)|
## +-------------+
## |         null|
## +-------------+


sqlContext.sql("SELECT NULL != NULL").show()
## +-------------------+
## |(NOT (NULL = NULL))|
## +-------------------+
## |               null|
## +-------------------+

The only valid way to compare a value with NULL is IS / IS NOT, which are equivalent to the isNull / isNotNull method calls.
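
To see this end to end, here is a minimal, self-contained sketch (using the newer SparkSession entry point rather than the answer's sqlContext; the events view name and the two sample rows are made up for this illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data mirroring the question's dt_mvmt column
df = spark.createDataFrame([("2016-03-27",), (None,)], ["dt_mvmt"])
df.createOrReplaceTempView("events")

# Equality against NULL matches nothing; IS NULL / IS NOT NULL behave as expected
spark.sql("SELECT * FROM events WHERE dt_mvmt = NULL").count()       # 0
spark.sql("SELECT * FROM events WHERE dt_mvmt IS NULL").count()      # 1
spark.sql("SELECT * FROM events WHERE dt_mvmt IS NOT NULL").count()  # 1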

Solution 2

Just use the isNotNull function:

df.filter(df.dt_mvmt.isNotNull()).count()

Solution 3

To obtain entries whose values in the dt_mvmt column are not null, use

df.filter("dt_mvmt is not NULL")

and for entries which are null, use

df.filter("dt_mvmt is NULL")

Solution 4

There are multiple ways to remove or filter out the null values from a column in a DataFrame.

Let's create a simple DataFrame with the code below:

from pyspark.sql.functions import col
from pyspark.sql.types import StringType

date = ['2016-03-27','2016-03-28','2016-03-29', None, '2016-03-30','2016-03-31']
df = spark.createDataFrame(date, StringType())

Now you can try one of the approaches below to filter out the null values.

# Approach - 1
df.filter("value is not null").show()

# Approach - 2
df.filter(col("value").isNotNull()).show()

# Approach - 3
df.filter(df["value"].isNotNull()).show()

# Approach - 4
df.filter(df.value.isNotNull()).show()

# Approach - 5
df.na.drop(subset=["value"]).show()

# Approach - 6
df.dropna(subset=["value"]).show()

# Note: You can also use the where function instead of filter.
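
With the sample data above, any of these approaches should produce the same five non-null rows, roughly:

## +----------+
## |     value|
## +----------+
## |2016-03-27|
## |2016-03-28|
## |2016-03-29|
## |2016-03-30|
## |2016-03-31|
## +----------+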

You can also check the section "Working with NULL Values" on my blog for more information.

I hope it helps.

Solution 5

isNull() / isNotNull() will return the rows where dt_mvmt is null or not null, respectively.

method_1 = df.filter(df['dt_mvmt'].isNotNull()).count()
method_2 = df.filter(df.dt_mvmt.isNotNull()).count()

Both will return the same result.
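
As a quick sanity check (a sketch reusing the question's df), the null and non-null counts should add up to the total row count:

nulls = df.filter(df['dt_mvmt'].isNull()).count()
not_nulls = df.filter(df['dt_mvmt'].isNotNull()).count()

assert method_1 == method_2 == not_nulls
assert nulls + not_nulls == df.count()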

Author: Ivan (Data Scientist, Systems and Big Data Architect, Physicist)

Updated on March 22, 2022

Comments

  • Ivan, about 2 years

    I'm trying to filter a PySpark dataframe that has None as a row value:

    df.select('dt_mvmt').distinct().collect()
    
    [Row(dt_mvmt=u'2016-03-27'),
     Row(dt_mvmt=u'2016-03-28'),
     Row(dt_mvmt=u'2016-03-29'),
     Row(dt_mvmt=None),
     Row(dt_mvmt=u'2016-03-30'),
     Row(dt_mvmt=u'2016-03-31')]
    

    and I can filter correctly with a string value:

    df[df.dt_mvmt == '2016-03-31']
    # some results here
    

    but this fails:

    df[df.dt_mvmt == None].count()
    0
    df[df.dt_mvmt != None].count()
    0
    

    But there are definitely values in each category. What's going on?

    • Atorpat, over 4 years
      You actually want to filter rows with null values, not a column with None values. The title could be misleading.
    • Richard Gomes, almost 4 years
      In a nutshell, a comparison involving null (or None, in this case) never evaluates to true. In particular, the comparison (null = null) yields null, not true, and so does comparing a column with None in PySpark; a filter treats that null result as false.
  • Ivan, almost 8 years
    Awesome, thanks. I thought that these filters on PySpark DataFrames would be more "pythonic", but alas, they're not. I'm thinking of asking the devs about this.
  • zero323, almost 8 years
    Actually, it is quite Pythonic. You should never check __eq__ with None ;) And is wouldn't work, because it doesn't behave the same way.
  • David Arenburg, over 6 years
    Strangely, this only works for string columns... It seems like df.filter("dt_mvmt is not NULL") handles both.