Is it a good idea to check whether any changes will actually happen before committing to the database with Python's pymysql, when a single query could potentially update a large number of rows, or is that a relative waste of time?
So instead of doing something like this:
<code>import pymysql

db = pymysql.connect(xxxx)
cur = db.cursor()
sql = "update TABLE set A = %s where B = %s"
params = ('abc', 'def')
cur.execute(sql, params)
db.commit()
</code>
Doing something like this:
<code>import pymysql

db = pymysql.connect(xxxx)
cur = db.cursor()
sql = "update TABLE set A = %s where B = %s"
params = ('abc', 'def')
affected_rows = cur.execute(sql, params)
if affected_rows > 0:
    db.commit()
</code>