I am having trouble migrating data from a 6-million-row table to another table in an Azure SQL (SQL Server) database because of resource limits. The database is on a relatively small tier, and when I send 100,000 rows at a time, DTU usage spikes to 100%, which compromises the performance and stability of the system.
I have tried several approaches for this migration, including:
Batch processing with Python (see the sketch below)
Using the INSERT SELECT command
Importing data from a CSV file to the destination table
However, all of these approaches overloaded the database and did not produce a smooth migration.
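To illustrate what I mean by the batching attempt, here is a minimal sketch. The connection string, the table names dbo.Source and dbo.Destination, the column names, and the ever-increasing Id key are all placeholders for illustration, not my exact schema:

```python
import time

import pyodbc

# Placeholder connection string; server, database, and credentials are not real values.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=myuser;PWD=mypassword"
)

BATCH_SIZE = 100_000  # the batch size that currently spikes DTU to 100%

conn = pyodbc.connect(CONN_STR)
cursor = conn.cursor()
last_id = 0  # assumes an ever-increasing integer key named Id

while True:
    # Copy the next slice server-side so the rows never round-trip through Python.
    cursor.execute(
        """
        INSERT INTO dbo.Destination (Id, Col1, Col2)
        SELECT TOP (?) Id, Col1, Col2
        FROM dbo.Source
        WHERE Id > ?
        ORDER BY Id
        """,
        BATCH_SIZE,
        last_id,
    )
    copied = cursor.rowcount
    conn.commit()  # commit per batch to keep each transaction small
    if copied == 0:
        break  # nothing left to copy
    # Advance the keyset to the highest Id copied so far
    # (assumes dbo.Destination is only populated by this migration).
    cursor.execute("SELECT MAX(Id) FROM dbo.Destination")
    last_id = cursor.fetchone()[0]
    time.sleep(2)  # brief pause between batches to let DTU usage recover

conn.close()
```

Even with this pattern, each 100,000-row batch pushes DTU to 100%; lowering BATCH_SIZE or lengthening the sleep reduces the spikes, but then the whole migration takes many hours.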
What are the best practices or techniques for migrating large volumes of data without driving database resource utilization this high? Is there an efficient way to break the migration into smaller steps without the whole process taking many hours, or to optimize it so that the impact on the database is minimized?