I have a very simple pipeline in Azure Data Factory:
It consists of 2 consecutive Copy data activities:
- Fetch a JSON file from an API and store it in a Data Lake with a file name containing the current date and time. The sink file name for the first activity is the following expression (a sketch of the sink dataset is shown after this list):
@concat('parking_', formatDateTime(utcnow(), 'yyyy-MM-ddTHH:mm'), '.json')
- Load the recently fetched JSON file as the source and insert its contents into a SQL database
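For context, a minimal sketch of how the sink dataset of the first Copy activity might be defined (the dataset, linked service, file system, and folder names are placeholders; only the file name expression is from the pipeline above):

```json
{
    "name": "ParkingJsonSink",
    "properties": {
        "type": "Json",
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStorage",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "raw",
                "folderPath": "parking",
                "fileName": {
                    "value": "@concat('parking_', formatDateTime(utcnow(), 'yyyy-MM-ddTHH:mm'), '.json')",
                    "type": "Expression"
                }
            }
        }
    }
}
```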
How can I save or output the filename so that it can be specified in and used by the last Copy data activity? If I reuse the same expression in the second activity, it might resolve to a different file name, because utcnow() would be evaluated at a later time.
> How can I save or output the filename so that it can be specified in and used by the last Copy data activity?
The easiest way to achieve this is to use an event-driven trigger that fires when a file is created in Blob storage.
- First, create a Storage event trigger that fires on blob-created events in the container where the first pipeline writes the JSON file.
- Create a second pipeline that loads the newly created JSON file into the SQL database, and add a pipeline parameter, e.g. filename, to it.
- In the source dataset of this pipeline, add a dataset parameter, also e.g. filename, and use it in the dataset's file name.
- In the Copy activity's source settings, pass the pipeline parameter into that dataset parameter.
- Attach the storage event trigger created above to this second pipeline.
- When attaching the trigger, map the filename pipeline parameter to the trigger output below (a sketch of the resulting trigger definition follows this list):
@triggerBody().fileName
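A minimal sketch of what the storage event trigger might look like in JSON, assuming the second pipeline is called CopyParkingJsonToSql and has a filename parameter (the trigger name, path filters, pipeline name, and storage account scope are placeholders; only @triggerBody().fileName is taken from the step above):

```json
{
    "name": "ParkingFileCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/raw/blobs/parking_",
            "blobPathEndsWith": ".json",
            "ignoreEmptyBlobs": true,
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "CopyParkingJsonToSql",
                    "type": "PipelineReference"
                },
                "parameters": {
                    "filename": "@triggerBody().fileName"
                }
            }
        ]
    }
}
```

When you attach the trigger to the pipeline in the ADF UI, it prompts for a value for each pipeline parameter; that is where @triggerBody().fileName goes.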
After the first pipeline run stores the file in Blob storage, this trigger fires the second pipeline, which copies the newly added JSON file's data into the SQL database.
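And a minimal sketch of how the second pipeline might pass the file name down to its source dataset (pipeline, activity, and dataset names are placeholders):

```json
{
    "name": "CopyParkingJsonToSql",
    "properties": {
        "parameters": {
            "filename": { "type": "string" }
        },
        "activities": [
            {
                "name": "CopyJsonToSql",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "ParkingJsonSource",
                        "type": "DatasetReference",
                        "parameters": {
                            "filename": {
                                "value": "@pipeline().parameters.filename",
                                "type": "Expression"
                            }
                        }
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "ParkingSqlTable",
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": { "type": "JsonSource" },
                    "sink": { "type": "AzureSqlSink" }
                }
            }
        ]
    }
}
```

In the ParkingJsonSource dataset, the file name would then be the dynamic expression @dataset().filename, so whichever file name the trigger passes in is the file that gets copied to SQL.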