I have a FastAPI endpoint in Python that needs to insert a few thousand rows into a SQL Server table. The data is in a pandas DataFrame, and at the moment I loop over the rows and insert them one at a time, which is slow enough that the endpoint times out. Is there a way to do a bulk insert instead?
I can't use SQLAlchemy: I have tried it, but the library crashes the Azure server the FastAPI script runs on.
<code>for _, row in new_df.iterrows():
    cursor.execute("""
        insert into ras_import_temp(account_no, amount, [description], eff_date, reference)
        values (?, ?, ?, ?, ?)
    """, row.to_list())
cursor.commit()
</code>
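Is something like the sketch below the right direction? It replaces the per-row loop with a single <code>executemany</code> call and pyodbc's <code>fast_executemany</code> flag. I'm assuming the driver is plain pyodbc (the <code>cursor.commit()</code> in my code suggests it), and the connection string below is just a placeholder; <code>new_df</code> is the same DataFrame as above. I haven't confirmed this runs on the Azure server, so treat it as a rough sketch.
<code>import pyodbc

# Placeholder connection string -- replace with the real server details.
conn = pyodbc.connect("DSN=my_sqlserver;Trusted_Connection=yes")
cursor = conn.cursor()

# Batch the parameter sets into a single round trip instead of one INSERT per row.
cursor.fast_executemany = True

sql = """
    insert into ras_import_temp(account_no, amount, [description], eff_date, reference)
    values (?, ?, ?, ?, ?)
"""

# itertuples(index=False, name=None) yields plain tuples in column order,
# which executemany accepts as its sequence of parameter sets.
params = list(new_df.itertuples(index=False, name=None))

cursor.executemany(sql, params)
conn.commit()
</code>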