Apache Spark subtract days from timestamp column


Solution 1

Cast the data to timestamp and use expr to subtract an INTERVAL:

import org.apache.spark.sql.functions.expr
import spark.implicits._  // for toDF and the $"..." column syntax

val df = Seq("2017-09-22 13:17:39.900").toDF("timestamp")

df.withColumn(
  "10_days_before", 
  $"timestamp".cast("timestamp") - expr("INTERVAL 10 DAYS")).show(false)
+-----------------------+---------------------+
|timestamp              |10_days_before       |
+-----------------------+---------------------+
|2017-09-22 13:17:39.900|2017-09-12 13:17:39.9|
+-----------------------+---------------------+

If the data is already of TimestampType, you can skip the cast.
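The interval arithmetic above can be sketched with plain Python datetimes (no Spark needed); it illustrates why the time component survives: subtracting a whole-day interval shifts only the date part.

```python
from datetime import datetime, timedelta

# Plain-Python sketch of the INTERVAL subtraction above: shifting a
# timestamp by a whole number of days preserves the time of day.
ts = datetime.strptime("2017-09-22 13:17:39.900", "%Y-%m-%d %H:%M:%S.%f")
ten_days_before = ts - timedelta(days=10)
print(ten_days_before)  # 2017-09-12 13:17:39.900000
```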

Solution 2

Or you can simply use the date_sub function, available in PySpark since 1.5:

from pyspark.sql.functions import col, date_sub

df.withColumn("10_days_before", date_sub(col("timestamp"), 10).cast("timestamp"))
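Note that date_sub returns a DateType (as the comment below points out), so casting the result back to timestamp yields midnight of the shifted day rather than the original time of day. A plain-Python analogy of that behaviour, assuming the same input timestamp:

```python
from datetime import datetime, time, timedelta

# Sketch of the date_sub pitfall: going through a DATE drops the
# time-of-day, so "casting back" to a timestamp gives midnight.
ts = datetime.strptime("2017-09-22 13:17:39.900", "%Y-%m-%d %H:%M:%S.%f")
as_date = ts.date() - timedelta(days=10)          # date_sub analogue -> 2017-09-12
back_to_ts = datetime.combine(as_date, time.min)  # cast('timestamp') analogue
print(back_to_ts)  # 2017-09-12 00:00:00 -- time component is lost
```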
Author by datahack

Updated on June 15, 2022

Comments

  • datahack (almost 2 years ago)

    I am using Spark Dataset and having trouble subtracting days from a timestamp column.

    I would like to subtract days from Timestamp Column and get new Column with full datetime format. Example:

    2017-09-22 13:17:39.900 - 10 ----> 2017-09-12 13:17:39.900
    

    With date_sub functions I am getting 2017-09-12 without 13:17:39.900.

  • CJR (about 4 years ago)
    Recently discovered that date_sub() returns a date, not a timestamp, and so you'll run into unexpected rounding.