Total PySpark noob here. I have a DataFrame similar to this:
from pyspark.sql import Row

# amt and dt are stored as strings ('yyyy/MM/dd' dates)
df = spark.createDataFrame([
    Row(ttype='C', amt='12.99', dt='2024/01/01'),
    Row(ttype='D', amt='21.99', dt='2024/02/15'),
    Row(ttype='C', amt='16.99', dt='2024/01/21'),
])
I want to find the average amt over the last 30, 60, and 90 days, separately for ttype='C' and ttype='D'.
I can compute each of these one at a time, but I'm looking for a more elegant way to do it (a rough sketch of my one-at-a-time approach is below).
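For reference, this is roughly what one of my one-at-a-time queries looks like (just a sketch; I'm assuming today's date as the cutoff, parsing dt with to_date, and casting amt to double):

from pyspark.sql import functions as F

# normalize the types first: dt is a 'yyyy/MM/dd' string, amt is a numeric string
clean = (
    df.withColumn("dt", F.to_date("dt", "yyyy/MM/dd"))
      .withColumn("amt", F.col("amt").cast("double"))
)

# average amt for ttype='C' over the last 30 days, relative to today
avg_c_30 = (
    clean.filter(
        (F.col("ttype") == "C")
        & (F.col("dt") >= F.date_sub(F.current_date(), 30))
    )
    .agg(F.avg("amt").alias("avg_amt"))
)

# ...then I repeat this for 60 and 90 days, and again for ttype='D'

Ideally I'd like all six numbers (two types x three windows) in one pass instead of six separate queries like this.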