I recently started a Data Engineering role and was tasked with finding a way to audit our existing pipelines in Azure Data Factory. The requirements are to get all existing components of each pipeline, when they run, whether they ran successfully, and whether any rows were added to or deleted from the databases after a pipeline run.
I am able to get most of this information from the Pipelines and Pipeline Runs APIs, but I am not sure how to find out which database tables were updated after a pipeline runs. For now, I would settle for something as basic as a row count on each table.
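For reference, this is roughly how I am pulling the run history today (a minimal sketch using the `azure-mgmt-datafactory` Python SDK; the subscription, resource group, and factory values are placeholders):

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

# Placeholders -- substitute your own values.
SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "<resource-group>"
FACTORY_NAME = "<factory-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Query every pipeline run from the last 24 hours.
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(days=1),
    last_updated_before=now,
)
runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)

for run in runs.value:
    print(run.pipeline_name, run.run_start, run.run_end, run.status)
```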
Has anyone done something like this? What is the best approach to achieve this?
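For concreteness, the most naive thing I can think of is snapshotting row counts before and after a run and diffing them, something like the sketch below (using `pyodbc` against Azure SQL and reading counts from `sys.partitions`; the connection details are placeholders):

```python
import pyodbc

# Placeholder connection string -- substitute your own server, database, and credentials.
CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>.database.windows.net;"
    "DATABASE=<database>;"
    "UID=<user>;PWD=<password>;Encrypt=yes;"
)

def table_counts(conn_str: str) -> dict[str, int]:
    """Return {schema.table: row_count} for every user table."""
    query = """
        SELECT s.name + '.' + t.name AS table_name, SUM(p.rows) AS row_count
        FROM sys.tables t
        JOIN sys.schemas s ON s.schema_id = t.schema_id
        JOIN sys.partitions p ON p.object_id = t.object_id
        WHERE p.index_id IN (0, 1)  -- heap or clustered index, so each row is counted once
        GROUP BY s.name, t.name;
    """
    with pyodbc.connect(conn_str) as conn:
        return {name: count for name, count in conn.execute(query)}

before = table_counts(CONN_STR)
# ... pipeline run happens here ...
after = table_counts(CONN_STR)

# Report only the tables whose row counts changed.
for table, count in after.items():
    delta = count - before.get(table, 0)
    if delta:
        print(f"{table}: {delta:+d} rows")
```

I realize this would miss updates and paired inserts/deletes that leave the count unchanged, which is part of why I am asking whether there is a better approach.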
Thank you in advance for helping a noob out. Any help is appreciated :)