Hi there -
I'm trying to figure out a way to update the schema of an archived feature class. There was a lot of discussion about this from 2012 and prior, the thread "Re: Beta 10: Archiving ID's" in particular. Following the links in that discussion led to quite a few workarounds, but all of them are for 9.2 or 10.0. I'm working in 10.2.2. Has this issue been resolved and I'm just missing something? Is there a simple way to do this now? It is also worth mentioning that my organization does not use versioning.
In the referenced discussion above, Russell Brennan mentions that "There is currently an enhancement request in our system to implement this functionality: NIM050307 - Option to attach the old archive table (_h) back to the original feature class on re-enabling archiving." Was this done?
I found this post Restoring Geodatabase History | Geodatabase Geek, and it has what seems to be the most reliable workaround. A Python script they created is mentioned, but apparently there is no way to do this entirely in Python, even in 10.1. The enabling and disabling of archiving must be done in ArcMap.
Overall, my question is: has there been an update that makes this process easy? I will have to do this from time to time and would like to maintain the archive tables, as I work in the fire industry and fire history is extremely important.
Take a look at the webhelp from 10.2.2.
Schema changes made to the dataset or object class are automatically pushed to the archive class. For example, if you add an attribute column to the feature class, that column is automatically added to the archive class. If you delete an attribute column, that column will also be removed from the archive class, removing all the archived information for this column.
Thank you for commenting. I've read the help section about this topic quite a few times. This method is fine for quick schema changes. However, all my services create lock files within the geodatabase, so I have to shut them off in order to make the changes. This disrupts client access. So I'm looking for a way around shutting the services down for long periods of time, if there is an update.
Thank you for commenting. I do mean map services, but we use many feature services as well. So anytime our services are running, all of our geodatabases are locked. This is how they have to work to maintain data integrity with multiple users accessing a database, so I know that stopping the services is unavoidable. However, ArcMap makes schema editing quite difficult, so exporting the schema to an application like Diagrammer is required. Because I cannot do many schema updates within ArcMap, my archive tables are left orphaned when I import the new schema into the GDB or SDE. A manual copy and paste, with some editing to the archive tables, is OK when we do not have years of data in there.
Have you tried disabling the schema lock on the map service? This would allow you to make changes to the schema in ArcMap.
However, if it is a requirement that you have schema locking turned on, I don't believe you will be able to make a schema change without removing those locks first.
Hopefully that helps!
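For anyone finding this thread later: disabling the schema lock on a map service is done by setting the service's schemaLockingEnabled property to false and saving the edited service definition through the ArcGIS Server Administrator Directory (the service's "edit" operation). The property name is real; the sketch below only shows the JSON edit itself, with a trimmed, hypothetical service definition, and leaves out the HTTP POST to the admin API.

```python
import json

def disable_schema_locking(service_json: str) -> str:
    """Return the service JSON with schemaLockingEnabled set to "false".

    The edited JSON would then be POSTed back through the ArcGIS Server
    Administrator Directory's edit operation for that service; that HTTP
    step (and the admin token it needs) is omitted here.
    """
    svc = json.loads(service_json)
    # ArcGIS Server stores service properties as strings in this dict
    svc.setdefault("properties", {})["schemaLockingEnabled"] = "false"
    return json.dumps(svc)

# Trimmed, hypothetical service definition for illustration only
original = ('{"serviceName": "FireHistory", "type": "MapServer", '
            '"properties": {"schemaLockingEnabled": "true"}}')
edited = disable_schema_locking(original)
print(json.loads(edited)["properties"]["schemaLockingEnabled"])  # false
```

The service restarts when its definition is edited, so this still causes a brief interruption, but afterward schema changes in ArcMap no longer require stopping the service entirely.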
I have no problems with updating while the locks are off. I know that editing the schema or database in any way while the locks are on is not possible, so I'm just looking to minimize the time that those databases are shut down. To accomplish that, I need to do the majority of any schema update beforehand so that I can just copy in the new schema, remap the data, and link the archive tables while our services are down.
There doesn't seem to be a clear-cut way to do this. I've asked this question in two other geocommunities, and it seems that linking an orphaned archive table is just not an easily accomplished task. I believe the manual workaround detailed in this article, Restoring Geodatabase History | Geodatabase Geek, is really the best way to keep our archive records consolidated and accessible.
I know this issue doesn't come up too often, so thank you to everyone who contributed!