Is there a way to store a dictionary as a column value in pyspark?
I have two Spark DataFrames with different numbers of columns, where the first column in each is the ID. For each ID, I want to collect the remaining column values into a dictionary (the visual would make better sense of what I am trying to achieve).