Drop spark dataframe from cache


Solution 1

Just do the following:

df1.unpersist()
df2.unpersist()

Spark automatically monitors cache usage on each node and drops out old data partitions in a least-recently-used (LRU) fashion. If you would like to manually remove an RDD instead of waiting for it to fall out of the cache, use the RDD.unpersist() method.
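The eviction policy described above can be illustrated with a toy LRU cache in plain Python. This is only a sketch of the LRU idea (Spark's actual block manager is far more involved); the class and method names are made up for illustration, with unpersist() mirroring the manual-removal role of RDD.unpersist():

```python
from collections import OrderedDict

class LRUCache:
    """Toy LRU cache: evicts the least-recently-used entry when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = OrderedDict()  # ordered oldest-used first

    def put(self, key, value):
        if key in self.entries:
            self.entries.move_to_end(key)       # mark as recently used
        self.entries[key] = value
        if len(self.entries) > self.capacity:
            self.entries.popitem(last=False)    # drop the least-recently-used entry

    def get(self, key):
        value = self.entries[key]
        self.entries.move_to_end(key)           # a read also counts as a use
        return value

    def unpersist(self, key):
        # Manual removal, analogous to RDD.unpersist()
        self.entries.pop(key, None)

cache = LRUCache(capacity=2)
cache.put("df1", "partitions of df1")
cache.put("df2", "partitions of df2")
cache.get("df1")                       # df1 is now the most recently used
cache.put("df3", "partitions of df3")  # evicts df2, the LRU entry
print(sorted(cache.entries))           # ['df1', 'df3']
```

Calling unpersist("df1") at this point would remove df1 immediately instead of waiting for it to age out.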

Solution 2

If the dataframe is registered as a table for SQL operations, like

df.createGlobalTempView(tableName) // or some other way as per spark version

then the cache can be dropped with the following commands. (Of course, Spark also does this automatically.)

Spark >= 2.x

Here spark is a SparkSession object.

  • Drop a specific table/df from cache

     spark.catalog.uncacheTable(tableName)
    
  • Drop all tables/dfs from cache

     spark.catalog.clearCache()
    
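Putting the Spark >= 2.x pieces together, a PySpark session might look like the sketch below. The app and table names are illustrative, and spark.range is used only to get a quick dummy dataframe; this is a sketch of the flow, not a definitive recipe:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cache-demo").getOrCreate()

df = spark.range(1000)
df.createOrReplaceTempView("my_table")   # register a session-scoped view
spark.catalog.cacheTable("my_table")     # mark the table for caching

spark.sql("SELECT COUNT(*) FROM my_table").show()  # action: materializes the cache

spark.catalog.uncacheTable("my_table")   # drop this one table from cache
spark.catalog.clearCache()               # or drop everything cached
```

Note that caching is lazy: the table is only actually materialized in memory when an action (such as the count above) runs against it.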

Spark <= 1.6.x

  • Drop a specific table/df from cache

     sqlContext.uncacheTable(tableName)
    
  • Drop all tables/dfs from cache

     sqlContext.clearCache()
    
Author: ankit patel

Updated on July 09, 2022

Comments

  • ankit patel
    ankit patel almost 2 years

    I am using Spark 1.3.0 with the Python API. While transforming huge dataframes, I cache many DFs for faster execution:

    df1.cache()
    df2.cache()
    

    Once a certain dataframe is no longer needed, how can I drop it from memory (i.e., un-cache it)?

    For example, df1 is used throughout the code, while df2 is utilized for a few transformations and after that is never needed. I want to forcefully drop df2 to release more memory space.

  • axlpado - Agile Lab
    axlpado - Agile Lab over 8 years
    And make sure to unpersist the df only after the end of the lineage, i.e. after the last action that involves the cached df.
  • spacedustpi
    spacedustpi almost 4 years
    I tried this for one of my dataframes "df" and when I did df.show(), df was still displaying data. When does it actually unpersist?
  • spacedustpi
    spacedustpi almost 4 years
    I tried these for my RDD 'df'. Why does typing df.show() still display data?
  • mrsrinivas
    mrsrinivas almost 4 years
    df.show() will display data irrespective of cache, as long as the input source for the data frame is available.
  • Itération 122442
    Itération 122442 about 2 years
    @spacedustpi it removes the dataframe from the cache. (Somewhere in memory or on disk if not enough space in memory) By calling show, you triggered an action and then the computation has been done from the beginning to show you the data.