I have a SQL Server code snippet where a stored procedure uses the DATEADD function to add minutes to a timestamp. Here's the code:
DATEADD(minute, no_of_minutes_to_add, timestamp_column) AS new_timestamp
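For context, here is a minimal sketch of how the call sits inside the procedure; the table name events is just a placeholder, not my actual table:

SELECT DATEADD(minute, no_of_minutes_to_add, timestamp_column) AS new_timestamp
FROM events;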
My requirement is to convert this stored procedure code to Spark SQL so it can run from PySpark. However, while Spark SQL does have a DATE_ADD function, it only accepts a number of days; there is no parameter to specify the unit, such as minute, day, or month, the way SQL Server's DATEADD does.
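To illustrate the limitation (column names as above), DATE_ADD in Spark SQL takes only a day count, and as far as I can tell it returns a DATE, so the time-of-day portion is dropped entirely:

SELECT DATE_ADD(timestamp_column, 1) AS next_day  -- shifts by whole days only; result is a DATE, not a TIMESTAMP
FROM events;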
What is the correct way to add minutes to a timestamp in Spark SQL?
Is there an equivalent function or a workaround? Any help would be appreciated. Thank you!