I know how to export a table from BigQuery to Google Cloud Storage (via the console, SQL, or bq). Of course, this is time-consuming if I have lots of tables to export.
So the question is plain and simple, but I haven't found any solution for it: is there a way to export a whole dataset in one go? I'm looking for some kind of good old mysqldump method. I don't mind if the file destination is still GCS; I'm just looking for some kind of automated process so I don't need to export every table individually. Since I need this for the Google Analytics data stored in BQ, the export format would still need to be JSON (because of the nested values).
Thanks.
Well, since I'm working with Google Analytics data, which is stored in tables named events_YYYYMMDD, one answer is obvious (I didn't think of it right away, though). The steps are:
- Query data from all the tables using a wildcard SQL statement like `SELECT * FROM events_*`
- Save the query results into a new table
- Download/export data from the new table to Google Cloud Storage as a JSON file (or another format if you prefer)
- Download the file from GCS to your computer
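The steps above can be sketched with the bq and gsutil command-line tools. This is just a sketch: the dataset, table, and bucket names (my_dataset, all_events, my-bucket) are placeholders, and the commands are wrapped in a function so you can review them before running.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sketch of the four steps with placeholder names; adjust to your project.
run_export() {
  # Steps 1+2: query all daily tables via a wildcard and save the result
  # into a new table (standard SQL is required for backtick wildcards)
  bq query --use_legacy_sql=false \
    --destination_table="my_dataset.all_events" \
    'SELECT * FROM `my_dataset.events_*`'

  # Step 3: export the combined table to GCS as newline-delimited JSON,
  # which preserves the nested Google Analytics fields
  bq extract --destination_format NEWLINE_DELIMITED_JSON \
    "my_dataset.all_events" "gs://my-bucket/all_events-*.json"

  # Step 4: download the exported shards to the current directory
  gsutil -m cp "gs://my-bucket/all_events-*.json" .
}

# run_export
```

The wildcard URI (`all_events-*.json`) matters for large tables, since a single export file is capped in size and BigQuery will shard the output across multiple files.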
The steps above work because I'm dealing with tables that share a similar naming pattern and the same schema. Still, if there's a better/quicker solution, or a more automated way to do this, please let me know.
Dealing with tables with different schemas would still need another approach, so if someone has the answer to the original question, please let me know.
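For the general case (arbitrary table names and schemas), one approach that should work is to list every table in the dataset and run one export job per table, since each `bq extract` only touches a single table. A sketch, assuming bq is installed and authenticated; the project, dataset, and bucket names passed in are placeholders:

```shell
#!/usr/bin/env bash
set -euo pipefail

# export_dataset PROJECT DATASET BUCKET
# Lists every table in PROJECT:DATASET and runs one `bq extract` per table,
# writing newline-delimited JSON (the format that keeps nested fields).
export_dataset() {
  local project="$1" dataset="$2" bucket="$3"

  # `bq ls --format=json` returns a JSON array of table descriptors;
  # pull out each tableId and export that table on its own
  bq ls --format=json --max_results=10000 "${project}:${dataset}" |
    python3 -c 'import json,sys; [print(t["tableReference"]["tableId"]) for t in json.load(sys.stdin)]' |
    while read -r table; do
      bq extract --destination_format NEWLINE_DELIMITED_JSON \
        "${project}:${dataset}.${table}" \
        "gs://${bucket}/${dataset}/${table}-*.json"
    done
}

# Example (placeholder names):
# export_dataset my-project my_dataset my-bucket
```

This sidesteps the schema problem entirely: nothing is merged, so each table's schema only has to be consistent with itself.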