Optimizing the execution time of inserting a large volume of data into the database using Sequelize
Initially, I was running around 20,000 queries individually through Sequelize, but it was taking too much time. To improve performance, I started concatenating the queries and executing them in batches of 500 using Sequelize’s raw query function. While this approach helped to some extent, I’m still exploring alternative methods that could further enhance efficiency.
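The batch approach I'm currently using looks roughly like this (the `users` table, its columns, and the `rows` array are simplified placeholders; `sequelize` is an already-configured Sequelize instance):

```javascript
// Split an array into batches of `size` (500 in my current setup).
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Build one multi-row INSERT per batch instead of one query per row,
// escaping values through Sequelize to avoid SQL injection.
async function insertInBatches(sequelize, rows) {
  for (const batch of chunk(rows, 500)) {
    const values = batch
      .map(r => `(${sequelize.escape(r.name)}, ${sequelize.escape(r.email)})`)
      .join(', ');
    await sequelize.query(`INSERT INTO users (name, email) VALUES ${values}`);
  }
}
```

For model-backed data, Sequelize's built-in `Model.bulkCreate(records)` produces the same kind of multi-row insert without hand-building SQL strings, so that may be worth comparing too.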
Specifically, I’m considering whether writing the queries to a file and then importing that file into the database through Sequelize could be a more efficient approach. However, I’m unsure about the feasibility and potential benefits of this method compared to the batch execution method I’m currently using.
Overall, my goal is to find the most efficient way to insert a large volume of data into the database using Sequelize, and I’m seeking insights or suggestions from the community on how to achieve this.