Copy activity – file has 3 columns but sink table has 5 columns – ADF
I have a source file with different record layouts and am trying to load the data into the same table in Synapse Analytics (dedicated SQL pool). I am trying to use the dynamic column mapping approach but am getting an error.
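A minimal sketch of the kind of mapping JSON that can be handed to the Copy activity's translator for the 3-column layout; the column names below are hypothetical placeholders, and the JSON would typically be built per record layout and supplied through a pipeline parameter via `@json(pipeline().parameters.columnMapping)` in the Mapping dynamic content:

```json
{
  "type": "TabularTranslator",
  "mappings": [
    { "source": { "name": "Col1" }, "sink": { "name": "TargetCol1" } },
    { "source": { "name": "Col2" }, "sink": { "name": "TargetCol2" } },
    { "source": { "name": "Col3" }, "sink": { "name": "TargetCol3" } }
  ]
}
```

With an explicit mapping like this only the three columns present in the file are mapped, so the two remaining sink columns would need to be nullable or have defaults.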
Auditing Pipelines in Azure Data Factory
I recently started a Data Engineering role and was tasked with finding a way to audit our existing pipelines in Azure Data Factory. The requirements are to get all existing components of the pipelines, when they run, whether they ran successfully, and whether any rows were added to or deleted from databases after the pipeline ran.
Get the item value from the ForEach activity if the item value (Folder_Name) contains .csv child items
I want to use a ForEach activity and use the item value passed into each iteration.
First it will use the item value to get the metadata of the folder from the parameterized ADLS location. If the ADLS location contains child items, I want to use an Append Variable activity to append the item value, and then use the appended variable value for further processing.
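A rough sketch of how the If Condition and Append Variable inside the ForEach might be wired, assuming hypothetical activity and variable names (`Get Folder Metadata`, `csvFolders`) and that each `item()` is a plain folder-name string:

```json
{
  "name": "If folder has csv children",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@contains(toLower(string(activity('Get Folder Metadata').output.childItems)), '.csv')",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "Append folder name",
        "type": "AppendVariable",
        "typeProperties": {
          "variableName": "csvFolders",
          "value": {
            "value": "@item()",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
```

The Get Metadata activity would run against the parameterized ADLS dataset for the current item with the "Child items" field selected; the appended `csvFolders` array can then drive the downstream processing.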
ADF UserErrorFileNotFound when using partitions
We are encountering an issue with the Copy activity in Azure Data Factory. Our data source is Oracle, and we use dynamic range partitioning. The data is then saved as Parquet files.
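For reference, a dynamic-range-partitioned Oracle source feeding a Parquet sink is usually shaped roughly like this (the partition column and bounds below are hypothetical placeholders):

```json
{
  "source": {
    "type": "OracleSource",
    "partitionOption": "DynamicRange",
    "partitionSettings": {
      "partitionColumnName": "ORDER_ID",
      "partitionLowerBound": "1",
      "partitionUpperBound": "1000000"
    }
  },
  "sink": {
    "type": "ParquetSink",
    "storeSettings": { "type": "AzureBlobFSWriteSettings" }
  }
}
```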
If Condition in Set Variable is throwing an error in Azure Data Factory
I am extracting the href from a link to get the URL for pagination; I store an API response in Blob Storage as JSON. Using a Lookup activity I read the JSON and set a variable to the next pagination URL. The logic below works fine when the file has a pagination URL and the URL is in a link array, but when there is no pagination URL present in the file, link is not an array and I get this error.
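One hedged way to sketch a guard for this (the activity, variable, and property names are hypothetical, and the presence test may need adjusting to the real payload): keep the array access out of the If Condition expression and only evaluate it in the True branch's Set Variable.

```json
{
  "name": "If pagination link exists",
  "type": "IfCondition",
  "typeProperties": {
    "expression": {
      "value": "@contains(string(activity('Lookup response').output.firstRow), '\"href\"')",
      "type": "Expression"
    },
    "ifTrueActivities": [
      {
        "name": "Set next page url",
        "type": "SetVariable",
        "typeProperties": {
          "variableName": "nextPageUrl",
          "value": {
            "value": "@activity('Lookup response').output.firstRow.link[0].href",
            "type": "Expression"
          }
        }
      }
    ]
  }
}
```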
Rename and reset a file creating a ZIP in Azure Data Factory
I’m banging my head trying to do the following (I don’t even know if it’s possible): create a ZIP based on a list of files (using the “List of Files” option) and, at the same time, rename a file and store it in a different folder, using Azure Data Factory.
Move files from SQL nvarchar to Azure Blob Storage and update URLs using Azure Data Factory
How can I use Azure Data Factory to move files stored as nvarchar in a SQL table to Azure Blob Storage and then update the corresponding rows with the new Blob URLs?
How to run a pipeline on a weekly basis in ADF
I have a requirement to run a pipeline only on weekends, that is, on Saturday and Sunday. I used a schedule-based trigger and applied a start date and end date, but I don't see any option through which I can run the pipeline only on weekends.
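A minimal sketch of a weekend-only schedule trigger definition, assuming a hypothetical pipeline name and a 06:00 UTC run time:

```json
{
  "name": "WeekendTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Week",
        "interval": 1,
        "startTime": "2024-01-06T06:00:00Z",
        "timeZone": "UTC",
        "schedule": {
          "weekDays": ["Saturday", "Sunday"],
          "hours": [6],
          "minutes": [0]
        }
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "MyWeekendPipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

In the portal UI this corresponds to choosing a weekly recurrence and ticking only Saturday and Sunday as execution days.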
Incrementally copy new files by LastModifiedDate with Azure Data Factory
![screenshot](https://i.sstatic.net/MBwbT11p.png)
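A minimal sketch of the copy source/sink this pattern typically uses, assuming a tumbling window trigger supplies the window bounds and both stores are Blob Storage (the names and types here are assumptions, mirroring the built-in "Copy new files only by LastModifiedDate" template):

```json
{
  "source": {
    "type": "BinarySource",
    "storeSettings": {
      "type": "AzureBlobStorageReadSettings",
      "recursive": true,
      "modifiedDatetimeStart": {
        "value": "@trigger().outputs.windowStartTime",
        "type": "Expression"
      },
      "modifiedDatetimeEnd": {
        "value": "@trigger().outputs.windowEndTime",
        "type": "Expression"
      }
    }
  },
  "sink": {
    "type": "BinarySink",
    "storeSettings": { "type": "AzureBlobStorageWriteSettings" }
  }
}
```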
How to write data to a SharePoint list using Azure Data Factory
I have a requirement where I need to write data to a SharePoint list using Azure Data Factory. Could someone please help here?