I have a blob storage container with a folder/subfolder structure, for example:
Container: XYZ
Folder: yyyy-MM-dd
Subfolder: HH
One folder is created for every day, and hourly subfolders are created inside it. I receive JSON files with Teams data in these hourly folders.
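For example, a file that arrives at 09:xx UTC on 2024-05-14 would land at a path like this (the file name is made up for illustration):
XYZ/2024-05-14/09/teams_data_001.json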
Now I want to create a pipeline and load the JSON data in near real time, say every 10-15 minutes.
I have created a copy activity; the source uses a serverless SQL pool to query the JSON for selected columns, and the sink is a table in a dedicated SQL pool.
This copy activity works fine on its own with a hardcoded file location.
However, I need to implement logic that loads the files incrementally every 10-15 minutes, using the dynamic folder structure described above.
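The dynamic folder path I am trying to build is roughly the following (using the run time in UTC; a trigger variable such as @trigger().scheduledTime could be used instead of utcnow() so the path matches the trigger window):
@concat(formatDateTime(utcnow(), 'yyyy-MM-dd'), '/', formatDateTime(utcnow(), 'HH'))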
I have created a pipeline for this, but I am getting the following error notification:
{
  "code": "BadRequest",
  "message": null,
  "target": "pipeline//runid/xxxxxxxxxxxxxxxxxxxxx",
  "details": null,
  "error": null
}
The error occurs on the Get Metadata activity while it is fetching the required details.
{ "code": "BadRequest", "message": null, "target": "pipeline//runid/xxxxxxxxxxxxxxxxxxxxx", "details": null, "error": null }
You get the above error notification when the JSON payload sent to management.azure.com is corrupt, or when you send an expression to a parameter that does not support expressions.
When executing the Get Metadata activity, pass the dynamic expression directly in the dataset properties. If you set it as the default value of a pipeline parameter, it is treated as a plain string and is not evaluated as an expression, as sketched below.
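As a rough sketch (the exact property name depends on how your dataset is parameterized), the same text behaves differently depending on where it is entered:

Evaluated at run time (entered in the activity's dataset properties):
@concat(formatDateTime(utcnow(), 'yyyy-MM-dd'), '/', formatDateTime(utcnow(), 'HH'))

Not evaluated (set as a pipeline parameter default value, which only accepts literal text):
@concat(formatDateTime(utcnow(), 'yyyy-MM-dd'), '/', formatDateTime(utcnow(), 'HH'))

If you do need a parameter, supply the expression as the parameter value at run time, not as its default value.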
To get past the error, check all the input values, confirm that the dynamic expression resolves to the correct value, and verify that it is passed through correctly.