I have a very simple pipeline in Azure Data Factory, consisting of two consecutive Copy data activities:
- Fetch a JSON file from an API and store it in a Data Lake, with a file name containing the current date and time. The sink file name for the first activity is:
`@concat('parking_', formatDateTime(utcnow(), 'yyyy-MM-ddTHH:mm'), '.json')`
- Load the recently fetched JSON file as the source and insert its contents into a SQL database
How can I save or output the file name so that it can be passed to and used by the second Copy data activity? If I simply repeat the same expression in the second activity, it might fail: the expression would be evaluated at a different time and could therefore produce a different file name.
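To illustrate the timing problem concretely, here is a small Python sketch (not ADF code) that mirrors the `yyyy-MM-ddTHH:mm` format string. The two timestamps stand in for the evaluation times of the two activities; when a minute boundary falls between them, the generated names no longer match:

```python
from datetime import datetime, timedelta, timezone

# Mirrors the ADF format 'yyyy-MM-ddTHH:mm' used in the sink file name
FMT = "parking_%Y-%m-%dT%H:%M.json"

# Hypothetical evaluation times: the first activity runs just before a
# minute boundary, the second a couple of seconds later.
t_first = datetime(2024, 1, 1, 9, 59, 59, tzinfo=timezone.utc)
t_second = t_first + timedelta(seconds=2)

name_written = t_first.strftime(FMT)   # parking_2024-01-01T09:59.json
name_looked_up = t_second.strftime(FMT)  # parking_2024-01-01T10:00.json

print(name_written)
print(name_looked_up)
print(name_written == name_looked_up)  # False: the second copy would miss the file
```

This is exactly the failure mode described above: re-evaluating `utcnow()` in the second activity is not guaranteed to reproduce the name the first activity wrote.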