Is it normal for a backup of my project using PySpark to take this long? I have tables with more than 200 million rows, and I'm investigating why it is so slow: uploading only 10 tables took almost 4 hours.
Thank you.
I have already tried uploading with compression (Snappy and Gzip), and with both the Parquet and CSV formats.
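
For reference, this is roughly the kind of write I'm describing (the table name and output path are placeholders, not my actual values):

```python
from pyspark.sql import SparkSession

# Placeholder session; in my environment this already exists.
spark = SparkSession.builder.appName("backup").getOrCreate()

# "my_db.big_table" is a placeholder for one of the ~200M-row tables.
df = spark.table("my_db.big_table")

# Write the table out as Snappy-compressed Parquet
# (I also tried gzip compression and CSV output).
(df.write
   .mode("overwrite")
   .option("compression", "snappy")
   .parquet("s3a://backup-bucket/big_table/"))
```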