I am creating GCP Dataflow jobs via Flex Templates, using Cloud Build to generate the templates.
This results in brand-new staging buckets being created every single time. For example, I have
a bucket named dataflow-staging-us-central1-bcc13063024968bf8d7e6420b45af926 containing plenty of directories,
and I have at least 3-4 other buckets like that.
What are the best practices for cleaning up these buckets?
I have a Cloud Build trigger that fires on every commit to my repo;
this trigger builds Flex Templates for all of the Dataflow jobs I have.
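For reference, the build step in my cloudbuild.yaml looks roughly like this (bucket, project, and image names below are placeholders, not my real values):

```yaml
steps:
  # Build the Flex Template container image and write the template spec to GCS.
  - name: gcr.io/google.com/cloudsdktool/cloud-sdk
    entrypoint: gcloud
    args:
      - dataflow
      - flex-template
      - build
      - gs://my-templates-bucket/templates/my-job.json   # template spec output path
      - --image-gcr-path=us-central1-docker.pkg.dev/my-project/my-repo/my-job:latest
      - --sdk-language=PYTHON
      - --py-path=.
```

There is one step like this per job, so every commit rebuilds every template.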
Perhaps what I am doing is not best practice?
Thanks and regards,
Marco