I am using ArcGIS 10.1 SP1 and have been trying to devise an indexing strategy for our new geodatabase. When implementing it, I am having problems combining unique indexes with registering as versioned. Whichever order I run the tools in, the second one fails: I can't end up with both unique indexes and versioning.
I saw an older 9.3 article on this which said you could do both if you registered as versioned first and then created your unique indexes. That hasn't worked for me. Also, the ArcGIS 10 help recommends against unique indexes because of the potential for compress problems.
Is it a good idea to skip unique indexes and use Data Reviewer to check uniqueness later? I thought it would make sense to enforce uniqueness where it should apply.
There are two problems with unique indexes on versioned data: 1) New data is added to the Adds table, not the business table, so the integrity constraint wouldn't fire when needed. 2) It's perfectly valid for the same value to appear in multiple rows of the Adds table (one per version), so long as only one of them is eventually posted to the business table.
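To make the first point concrete, here is a minimal pure-Python sketch of the versioned delta-table behavior (the table and column names are invented for illustration; this is not the actual SDE schema or arcpy). A versioned insert lands in the Adds ("A_") delta table, so a unique index defined on the base table alone never sees the new rows:

```python
# Simulated versioned geodatabase tables (names are illustrative only).
base_table = [
    {"objectid": 1, "parcel_id": "P-100"},
    {"objectid": 2, "parcel_id": "P-101"},
]
adds_table = []  # the "A_" delta table; versioned inserts land here


def versioned_insert(row):
    """A versioned edit never touches the base (business) table directly."""
    adds_table.append(row)


def base_table_unique(column):
    """All a unique index on the base table alone can ever check."""
    values = [r[column] for r in base_table]
    return len(values) == len(set(values))


# Two different versions each insert the same parcel_id.
versioned_insert({"objectid": 3, "parcel_id": "P-102"})  # edited in version A
versioned_insert({"objectid": 4, "parcel_id": "P-102"})  # edited in version B

# The base table still passes the uniqueness check -- the duplicates live
# in the Adds table, so the constraint never fires even though a conflict
# exists between the two versions.
print(base_table_unique("parcel_id"))  # True
print(len(adds_table))                 # 2
```

This also illustrates the second point: both versions' rows legitimately coexist in the Adds table, and uniqueness only needs to hold once one of them is posted to the business table during reconcile/post and compress.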
There's also a performance cost associated with unique indexes in general, so if you're not going to see a benefit from that cost, why pay it?
Thanks for that explanation. Do you think the article I saw was referring to manually building unique indexes on just the base tables? I couldn't build any unique indexes after versioning with the GP tools.