Dropping nested struct field from parquet file and updating schema using pyspark
I am new to Python and PySpark, so I need your help in understanding and resolving this problem. I have some parquet log files with the following schema: