I had a situation where an external process was doing updates on a million row table, but only
a few dozen were actually changed (plus a handful were deleted and a score added). I couldn't
use a trigger, so I created a parallel table containing the primary key (a string with a GUID in it),
and a hash digest of the dozens of active columns. Then I could create a second
hash table with the "revised" contents, compare the hashes to generate lists of added, deleted,
and changed rows, and then update the "current" hash table based on the needed action. This
was very effective for generating a low-bandwidth change feed to keep an external database in
sync; you could use something like this to track deltas between subsequent passes.
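A minimal Python sketch of the compare step, under my own assumptions (the function and variable names are hypothetical, and I'm using SHA-256 as the digest; any stable hash of the tracked columns works the same way):

```python
import hashlib

def digest(row, columns):
    # Join the tracked columns with a separator unlikely to appear in the data,
    # then hash; two rows with identical tracked columns get identical digests.
    joined = "\x1f".join(str(row[c]) for c in columns)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def diff_snapshots(current, revised, columns):
    # current/revised: dicts mapping primary key (e.g. a GUID string) -> row dict.
    cur = {pk: digest(row, columns) for pk, row in current.items()}
    new = {pk: digest(row, columns) for pk, row in revised.items()}
    added   = sorted(pk for pk in new if pk not in cur)
    deleted = sorted(pk for pk in cur if pk not in new)
    changed = sorted(pk for pk in new if pk in cur and new[pk] != cur[pk])
    return added, deleted, changed

# Example: g2's column changed, g3 was deleted, g4 was added.
current = {"g1": {"a": 1, "b": 2}, "g2": {"a": 3, "b": 4}, "g3": {"a": 5, "b": 6}}
revised = {"g1": {"a": 1, "b": 2}, "g2": {"a": 3, "b": 9}, "g4": {"a": 7, "b": 8}}
added, deleted, changed = diff_snapshots(current, revised, ["a", "b"])
print(added, deleted, changed)  # ['g4'] ['g3'] ['g2']
```

After emitting the three lists, you'd overwrite the "current" hash table with the "revised" one and ship only the listed rows downstream.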
- V