ETL process does not update SDE table because of Object ID

08-04-2011 07:58 AM
Sri_VarunMuthyala
New Contributor II
Hello,

I'm running into a couple of roadblocks while creating an automated process that updates data from one database into another and then registers it with the geodatabase and geocodes it.

• We have a corporate database that gets updated every month, and the new versions of the data from CorpDB are distributed to various other databases each month (not on a fixed date and time) via an ETL process.

• This process does not send updates; it essentially deletes everything and adds it again. From these tables I would like to create a view, register it, and geocode it dynamically. (I'm not able to do this, because dynamic geocoding does not work with a view; it only works with a table.)

• When I create another table exactly like the view, register it with the geodatabase, and enable dynamic geocoding, adding a new record to the table manually works and the dynamic geocode runs. But when the ETL process updates the table with new records, it fails because of the Object ID field.

Is there any way we can automate this process?
2 Replies
VinceAngelo
Esri Esteemed Contributor
I had a situation where an external process was doing updates on a million-row table, but only a few dozen rows were actually changed (plus a handful were deleted and a score added). I couldn't use a trigger, so I created a parallel table containing the primary key (a string with a GUID in it) and a hash (aka "digest") of the dozens of active columns. Then I could create a second hash table with the "revised" contents, compare the hashes to generate lists of added, deleted, and changed rows, and then update the "current" hash table based on the needed action. This was very successful at generating a low-bandwidth change feed to keep an external database in sync; you could use something like this to track deltas between successive passes.
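
A minimal sketch of that comparison, assuming each pass can be read as an iterable of dicts keyed on the GUID; the column names, the key field, and the build_digest() helper are hypothetical, and SHA-256 stands in for whatever digest was actually used:

import hashlib

ACTIVE_COLUMNS = ["name", "address", "city", "zip"]  # hypothetical columns that matter for change detection

def build_digest(row):
    """Concatenate the active columns and hash them into a fixed-length digest."""
    payload = "|".join(str(row[col]) for col in ACTIVE_COLUMNS)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def build_hash_table(rows, key_field="guid"):
    """Map each primary key (a GUID string) to the digest of its active columns."""
    return {row[key_field]: build_digest(row) for row in rows}

def compare_hash_tables(current, revised):
    """Return the keys that were added, deleted, or changed between two passes."""
    added   = [k for k in revised if k not in current]
    deleted = [k for k in current if k not in revised]
    changed = [k for k in revised if k in current and revised[k] != current[k]]
    return added, deleted, changed

# Usage: build one table from the previous pass and one from the new pass,
# apply only the resulting deltas to the target, then keep the revised table
# as the new "current" table for the next pass.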

- V
BruceHarold
Esri Regular Contributor
Hi Sri

If you have the Data Interoperability extension you can adopt a workflow where only changes are written to the target table (the source table for the dynamically geocoded feature class).

It would work like this: the source DBMS table is read into the workspace, and so is the target table. Records are compared with the ChangeDetector transformer, and added and deleted records are identified (updates will show up as new records from the source and unknown records from the target, when compared against the source table of record). Assign a feature attribute named "fme_db_operation" a value of INSERT or DELETE and write to the target geodatabase table; this writes only the deltas and triggers the geocode.
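
Outside of Data Interoperability, the same delta logic can be sketched in plain Python to show what gets written; both tables are assumed to fit in memory as lists of dicts, and the "site_id" match key is a hypothetical stand-in:

def detect_changes(source_rows, target_rows, key="site_id"):
    """Compare the source (table of record) against the target and tag each
    delta record with an fme_db_operation value, mirroring the workflow above."""
    source = {row[key]: row for row in source_rows}
    target = {row[key]: row for row in target_rows}

    deltas = []
    for k, row in source.items():
        if k not in target:
            deltas.append({**row, "fme_db_operation": "INSERT"})
        elif row != target[k]:
            # An update appears as a delete of the old target record
            # plus an insert of the new source record
            deltas.append({**target[k], "fme_db_operation": "DELETE"})
            deltas.append({**row, "fme_db_operation": "INSERT"})
    for k, row in target.items():
        if k not in source:
            deltas.append({**row, "fme_db_operation": "DELETE"})
    return deltas

# Writing only these delta records to the registered geodatabase table lets the
# geodatabase manage the Object IDs and lets dynamic geocoding fire on the inserts.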

Regards