Currently I'm using SQL Server for our structured data. A client uploads a file containing at least a million records, which lands in Blob Storage, and those records are then inserted into different tables in SQL Server. What I want to know is: can I move this architecture entirely to a Data Lake? Does a Data Lake support transactions?
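For context, the current insert step looks roughly like this (a simplified sketch, assuming pyodbc; the connection details and table/column names are placeholders, not our real ones):

```python
import pyodbc

# Simplified sketch of the current ingest: rows parsed from the uploaded blob
# are bulk-inserted into SQL Server inside a single transaction.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;UID=user;PWD=secret"
)
cursor = conn.cursor()
cursor.fast_executemany = True  # speeds up large batches

# In reality this is ~1M tuples parsed from the uploaded file
rows = [(1, "Alice", 10.0), (2, "Bob", 20.0)]

cursor.executemany(
    "INSERT INTO dbo.CustomerRecords (Id, Name, Amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()  # all-or-nothing thanks to the SQL transaction
```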
I also have WebJobs and Function Apps in Azure, and the plan is to convert the Function Apps to PySpark jobs as well, so in short everything would move from Azure to Fabric. However, I haven't found any article that supports this approach, and I don't know what impact it would have on performance.
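What I have in mind for the PySpark side is roughly the following (a minimal sketch, assuming the uploaded file lands in a Fabric lakehouse Files area and is written to a Delta table; the path and table name are made up):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read the uploaded customer file from the lakehouse Files area
df = spark.read.option("header", "true").csv("Files/uploads/customer_batch.csv")

# Delta tables are ACID: this append either fully commits or is rolled back,
# which is the transactional behaviour I'm hoping can replace the SQL inserts.
df.write.format("delta").mode("append").saveAsTable("customer_records")
```

Is something like this a reasonable replacement for the transactional inserts we do in SQL Server today?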
Since we are getting millions of records from a single customer, the client is pushing us to move towards a Data Lake.
I would like to hear your thoughts on this.
Thanks!
I have researched and read many articles.