We recently moved to a versioned database and enabled archiving at the same time so we can manage changes that may need to be rolled back after a user mistake. While this feature fulfilled one business requirement, it falls short on the maintenance side.

We have over 400K instances of a particular class, and archiving every change to that class will grow the database significantly over time. We need the ability to purge data as it ages, but there are no tools today to remove historical records older than a given number of days, months, or years. We need a tool that removes archived data whose expiration date (GDB_TO_DATE) is older than a specified date.

The only way to accomplish this today is to run a Python script that goes directly at the _H table and deletes these records, which is apparently unsupported. An example in SQL would be:

delete from XXX_h where gdb_to_date < to_date('03/25/2013','MM/DD/YYYY')
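To illustrate the purge pattern only (not a supported Esri workflow), here is a minimal Python sketch of a date-based delete against a history table. It uses an in-memory SQLite table named parcels_h with an ISO-formatted gdb_to_date column; the table name, column format, and cutoff value are all hypothetical stand-ins for the real _H table:

import sqlite3

# Build a throwaway history table that mimics an archive _H table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parcels_h (objectid INTEGER, gdb_to_date TEXT)")
conn.executemany(
    "INSERT INTO parcels_h VALUES (?, ?)",
    [(1, "2012-06-01"), (2, "2013-01-15"), (3, "2014-07-04")],
)

# Purge every archived row whose retirement date precedes the cutoff.
# ISO dates sort lexicographically, so a plain string comparison works here.
cutoff = "2013-03-25"
cur = conn.execute("DELETE FROM parcels_h WHERE gdb_to_date < ?", (cutoff,))
conn.commit()
print(cur.rowcount)  # → 2 rows purged; row 3 survives

Against a production geodatabase the same idea would run through the database's own client, with the vendor-specific date syntax shown in the SQL example above.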
Thanks for the idea! I believe that the workflow you describe can be achieved using the Trim Archive History geoprocessing tool. Check out the linked documentation and let us know if this meets your archiving needs.