Tags: postgresql, sql-insert, audit-logging

How can I properly record the value of my row after it has been updated?


I'm trying to track how my data is changed.

I am working on a custom solution because I will track those changes only when I run the script.

What is most important to me is to have the previous_value and the value_after_update.

INSERT INTO audit_details
    (updated_table, updated_column, query, value_before_update, value_after_update)
SELECT 'some_table', 'some_column', query, enabled,
       (/* here I tried to put the same expression that the UPDATE below uses,
           because that is the value the column will have after the update */)
FROM some_table WHERE product_id = _new_product_id;


UPDATE some_table t1
SET enabled = CASE WHEN (SELECT 1 FROM some_table
                         WHERE enabled = true
                           AND product_type = t1.product_type
                           AND product_id = ANY(_previous_product_ids)) IS NOT NULL
        THEN true
        ELSE enabled END,
    accepted = CASE WHEN (/* a similar subquery for the accepted column */) IS NOT NULL
        THEN true
        ELSE accepted END
WHERE product_id = _new_product_id;

As you can notice guys, the problem is that I don't know how to properly insert the values_after_update as well. Should I somehow repeat the block from the UPDATE inside my INSERT above? (I tried that, and even though I couldn't get it to run, it was big and ugly XD)

Any kind of help would be awesome!

Thanks

Cheers


Solution

  • You can do that with a single statement chaining multiple CTEs that captures the old and the new value before and after the UPDATE.

    Then you can "iterate" over the columns by converting the rows to JSONB and extracting the changed values.
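
    To see the JSONB trick in isolation: jsonb_each_text() turns a row (converted with to_jsonb()) into (column, value) pairs of text, so two versions of the same row can be joined on the column name and filtered down to the values that actually differ. The literal values below are made up purely for the demonstration:

    select n.col as column_name, o.value as old_column_value, n.value as new_column_value
    from jsonb_each_text('{"id": "1", "enabled": "false", "some_value": "4"}'::jsonb) as n(col, value)  -- the "new" row
      join jsonb_each_text('{"id": "1", "enabled": "true",  "some_value": "4"}'::jsonb) as o(col, value) -- the "old" row
        on o.col = n.col
       and n.value is distinct from o.value;

    -- column_name | old_column_value | new_column_value
    -- enabled     | true             | false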

    with old_data as (
      -- collect the old values
      select id, to_jsonb(t) as old_value
      from some_table t
      where product_id = 100
    ), new_data as (
    
      -- this is where the actual UPDATE is done
      -- change the SET part as you need it
      update some_table
         set enabled = false,
             some_value = 4
      where product_id = 100
      -- RETURNING makes the modified row available to the next CTE
      returning id, to_jsonb(some_table) as new_value
    
    ), changed_column_values as (
      -- this converts the JSON values into one row per column
      -- and selects those column values that have changed
      -- it is assumed that the column some_table.id is the primary key 
      select nd.id, x.*
      from new_data nd
        join old_data od using (id)
        cross join lateral (
          select n.col as column_name, o.value as old_column_value, n.value as new_column_value
          from jsonb_each_text(nd.new_value) as n(col, value)
            join jsonb_each_text(od.old_value) as o(col, value) on o.col = n.col and n.value is distinct from o.value
        ) x
    )
    -- now insert the result of the previous comparison into the audit table
    -- (the query column from the question is left out here; add a value for it if you need it)
    insert into audit_details (updated_table, updated_column, value_before_update, value_after_update)
    select 'some_table', column_name, old_column_value, new_column_value
    from changed_column_values;
    

    Online example
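
    For reference, here is a minimal schema the statement above can run against. The column names and types are assumptions pieced together from the question, so adjust them to your real tables:

    -- assumed layout of the table being updated; id is the primary key
    create table some_table (
      id           bigint primary key,
      product_id   integer,
      product_type text,
      enabled      boolean,
      accepted     boolean,
      some_value   integer
    );

    -- assumed layout of the audit table; all values are stored as text
    create table audit_details (
      updated_table       text,
      updated_column      text,
      query               text,
      value_before_update text,
      value_after_update  text
    );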


    While the above code works, it's pretty ugly and prone to errors just to catch the changes for a single UPDATE statement.

    I would use one of the many ready-made generic auditing solutions that can be found here or here or here and attach the generic trigger only to those tables where you need it.

    Or extend the trigger function to check e.g. a configuration setting and only store the changes if that setting is set to true. Then, whenever you want to "debug" your statements, turn the audit logging on (set session ...), run the script, and turn it off again afterwards. You can also create a function for that, similar to this one.
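
    A minimal sketch of that idea, assuming PostgreSQL 11+ and the audit_details layout from the question; the setting name myapp.audit_enabled and the function and trigger names are made up for the example:

    -- generic audit trigger: compares OLD and NEW as JSONB and logs the changed columns,
    -- but only while the custom session setting myapp.audit_enabled is set to 'true'
    create or replace function audit_if_enabled() returns trigger as
    $$
    begin
      if coalesce(current_setting('myapp.audit_enabled', true), 'false') <> 'true' then
        return new;  -- logging switched off, do nothing
      end if;

      insert into audit_details (updated_table, updated_column, query, value_before_update, value_after_update)
      select tg_table_name, n.col, current_query(), o.value, n.value
      from jsonb_each_text(to_jsonb(new)) as n(col, value)
        join jsonb_each_text(to_jsonb(old)) as o(col, value)
          on o.col = n.col
         and n.value is distinct from o.value;

      return new;
    end;
    $$ language plpgsql;

    -- attach it only to the tables you want to track
    create trigger trg_audit_some_table
      after update on some_table
      for each row execute function audit_if_enabled();

    -- turn the logging on for the current session, run the script, then turn it off again
    set session myapp.audit_enabled = 'true';
    -- ... run the UPDATE statements you want to debug ...
    set session myapp.audit_enabled = 'false';

    With that in place you don't need the big chained-CTE statement any more; a plain UPDATE is enough while the setting is switched on.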