Is there a way to write multiple PySpark DataFrames as files to an Azure storage container in one piece of code?
Let's say I have the following PySpark DataFrames generated in my code:
df1 = spark.sql("select * from ABC")
df2 = spark.sql("select * from XYZ")
df3 = spark.sql("select * from PQR")
I would like to write these DataFrames from Databricks to an Azure container, into a specific folder.
I know the code below can place a single file:
df1.coalesce(1).write.format('csv').mode('overwrite').save("FileStore/xyz/df_cleaned")
The above code places only one file in the df_cleaned folder. However, I would like to put the files for all three DataFrames (df1, df2, df3) into df_cleaned. Kindly suggest a suitable way to put all the files in that specific folder (df_cleaned).
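For reference, the closest I have come up with is the loop below. It is just a sketch of what I have in mind (the subfolder names "df1", "df2", "df3" are placeholders I made up): it writes each DataFrame under the same parent folder but into its own subfolder, since writing all three to the exact same path with mode('overwrite') would wipe out the earlier outputs.

# Sketch (untested): write each DataFrame under df_cleaned, one subfolder
# per DataFrame so mode('overwrite') does not clobber the earlier writes.
dataframes = {"df1": df1, "df2": df2, "df3": df3}
for name, df in dataframes.items():
    df.coalesce(1).write.format('csv').mode('overwrite').save(f"FileStore/xyz/df_cleaned/{name}")

With this, each DataFrame ends up as a single part-xxxx.csv file inside its own subfolder of df_cleaned rather than as three files directly in df_cleaned, so I am not sure whether this is the right approach or whether the part files would still need to be moved and renamed afterwards (e.g. with dbutils.fs.mv).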