Pandas hits a 1024-partition limit when writing parquet files to S3
I have a pandas DataFrame that I am writing to S3 using the PyArrow engine. When I ask for the data to be partitioned, PyArrow throws an error saying that more than 1024 partitions cannot be written. Is there a way to overcome this limitation?
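Here is a minimal sketch of the failing call. The bucket path and column names are placeholders, not my real dataset; the key point is that the partition column has more than 1024 distinct values:

```python
import numpy as np
import pandas as pd

# Hypothetical DataFrame whose partition column has > 1024 distinct values.
df = pd.DataFrame({
    "id": np.arange(2000),       # 2000 distinct partition keys
    "value": np.random.rand(2000),
})

# Writing with the PyArrow engine, partitioned on the high-cardinality
# column. "s3://my-bucket/dataset/" is a placeholder path.
df.to_parquet(
    "s3://my-bucket/dataset/",
    engine="pyarrow",
    partition_cols=["id"],
)
# Fails with an error along the lines of:
#   pyarrow.lib.ArrowInvalid: Fragment would be written into 2000
#   partitions. This exceeds the maximum of 1024
```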