I'm working on a PostgreSQL trigger function to log changes to a `data_sync_datahistory` table whenever there's an `INSERT` or `UPDATE` operation on other tables. The trigger is set up to only log these changes when `application_name` is set to `'dibpos_offline'`. However, I'm encountering an issue where some records are not being inserted into the `data_sync_datahistory` table when multiple records are inserted into the database.
Here is the body of my trigger function:
```sql
IF current_setting('application_name') = 'dibpos_offline' THEN
    IF TG_OP = 'INSERT' THEN
        INSERT INTO data_sync_datahistory (created, modified, data, source, table_name)
        VALUES (CURRENT_TIMESTAMP, CURRENT_TIMESTAMP, ROW_TO_JSON(NEW), 'local', TG_TABLE_NAME);
    ELSIF TG_OP = 'UPDATE' THEN
        INSERT INTO data_sync_datahistory (created, modified, data, source, table_name)
        VALUES (CURRENT_TIMESTAMP, CURRENT_TIMESTAMP, ROW_TO_JSON(NEW), 'local', TG_TABLE_NAME);
    END IF;
END IF;
RETURN NEW;
```
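For completeness, the function is attached to each tracked table with a statement along these lines (the trigger, table, and function names below are placeholders, not my real ones):

```sql
-- Placeholder names; the real function contains the body shown above.
CREATE TRIGGER data_sync_history_log
    AFTER INSERT OR UPDATE ON tracked_table
    FOR EACH ROW
    EXECUTE FUNCTION log_data_sync_history();
```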
And here is the relevant part of my database configuration:
```python
DATABASES = {
    'default': {
        'NAME': os.environ.get('DB_NAME', "postgres") if os.environ.get("DB_HOST", None) else os.path.join(Path.home(), 'dibpos.sqlite3'),
        'USER': os.environ.get('DB_USER', "postgres"),
        'PASSWORD': os.environ.get('DB_PASSWORD', '1234'),
        'HOST': os.environ.get('DB_HOST', ''),
        'PORT': os.environ.get('DB_PORT', 5432),
        'CONN_MAX_AGE': None,
        'CONN_HEALTH_CHECK': True,
        'OPTIONS': {
            'options': '-c application_name=dibpos_offline'
        },
    }
}
```
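While debugging, a query along these lines can be used to see which `application_name` each active connection reports while the app is writing (this is only a diagnostic sketch, not part of my application):

```sql
-- Diagnostic: list how the current database's connections identify themselves.
-- If some rows show an empty or different application_name, inserts made
-- through those connections would be skipped by the trigger's IF check.
SELECT application_name, state, count(*)
FROM pg_stat_activity
WHERE datname = current_database()
GROUP BY application_name, state
ORDER BY count(*) DESC;
```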
Problem: When multiple records are inserted into the database, some of them are not logged into the `data_sync_datahistory` table. The trigger function seems to be missing some records.
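A minimal reproduction would look something like this (the tracked table and column names are illustrative, not my real schema):

```sql
-- Illustrative: bulk-insert into a tracked table, then compare counts.
INSERT INTO tracked_table (payload)
SELECT 'row ' || g FROM generate_series(1, 100) AS g;

SELECT count(*) AS history_rows
FROM data_sync_datahistory
WHERE table_name = 'tracked_table'
  AND created >= now() - interval '1 minute';
```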
Questions:
- Could the issue be related to the database configuration or the trigger function itself?
- Are there any common pitfalls or things to check when using triggers with `application_name` settings?
- Is there a way to ensure that all records are logged consistently?
Any advice or suggestions on how to troubleshoot this issue would be greatly appreciated!