The spirit of this post is to gather other people's solutions, discuss ways of improving the suggested solution, and track ArcGIS capabilities for this problem as they evolve.
This all started because we had existing data in a GDB that contained a Relationship Class (RC) between a Feature Class (FC) and a Table (TBL) using the FC's GlobalID field. We wanted to move that data into a new GDB that had a new schema (changes in domains, fields, etc.).
The problem is that when you Append the old data into the new schema, new GlobalID values are generated, so they no longer match the values carried over from the old data. This breaks the relationship between the FC and the related TBL.
UPDATE:
I have a much, MUCH better solution that I discovered, and it doesn't appear to be documented anywhere. Pay no attention to the strikethrough text below; that was the old, convoluted solution.
New Solution:
In ArcGIS Pro, run the Append tool with the Preserve Global IDs option checked (it's also available as a geoprocessing environment setting). The appended records then keep their original GlobalID values, so the relationship between the FC and the TBL survives the move into the new schema.
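For the scripted equivalent, arcpy exposes this as the preserveGlobalIds geoprocessing environment, which the Append tool honors. A minimal sketch, with hypothetical paths:

```python
import arcpy

# Keep the incoming GlobalID values instead of generating new ones.
arcpy.env.preserveGlobalIds = True

# Hypothetical paths -- substitute your own old and new geodatabases.
source_fc = r"C:\data\old.gdb\Poles"
target_fc = r"C:\data\new.gdb\Poles"

# NO_TEST lets Append run despite the schema changes (domains, fields, etc.);
# supply an explicit field mapping if the schemas differ by more than that.
arcpy.management.Append(source_fc, target_fc, "NO_TEST")
```

If I remember the tool documentation correctly, a file geodatabase target also needs a unique index on its GlobalID field for the preservation to work, which lines up with the index discussion further down this thread.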
Pay no attention:
After doing some searching, I discovered that there is a way to do this using an Enterprise Geodatabase (EGDB) and ArcGIS Pro.
I'm posting my basic workflow here for preserving the FC's GlobalID values so that, when you migrate the data over to the new schema, the GlobalID values stay the same in the new FC. This preserves the relationship between the FC and the TBL in the new schema.
Assumptions:
Manual Steps:
At this point, you can copy/paste that new FC back to your location of choice and rebuild the RC so that it connects to the new TBL. It turns out that the GUIDs used in the related table to relate back to the FC are naturally preserved by the Append tool in ArcCatalog, so performing the workflow above on the related TBL is unnecessary. Even though the TBL's GlobalID (not GUID) values change when moving the data, that doesn't matter to us, because they aren't used to create the relationship.
We don't do this often, so we aren't going to put in the effort to automate it, but I assume that would be possible.
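For anyone who does want to automate it, here's a rough sketch of the scripted migration using the Append approach from the update above. Every path, dataset name, and relationship setting below is a hypothetical placeholder; the CreateRelationshipClass arguments in particular would need to mirror your original RC:

```python
import arcpy

arcpy.env.preserveGlobalIds = True  # keep the FC's GlobalIDs through the Append

# Hypothetical workspaces and names -- adjust for your own schemas.
old_gdb = r"C:\data\old.gdb"
new_gdb = r"C:\data\new.gdb"

# 1) Append the feature class; GlobalIDs are preserved by the environment above.
arcpy.management.Append(old_gdb + r"\Poles", new_gdb + r"\Poles", "NO_TEST")

# 2) Append the related table; its GUID foreign-key values come across as
#    ordinary attribute values, so nothing special is needed here.
arcpy.management.Append(
    old_gdb + r"\PoleInspections", new_gdb + r"\PoleInspections", "NO_TEST"
)

# 3) Rebuild the RC on GlobalID (origin key) -> GUID (foreign key in the TBL).
arcpy.management.CreateRelationshipClass(
    origin_table=new_gdb + r"\Poles",
    destination_table=new_gdb + r"\PoleInspections",
    out_relationship_class=new_gdb + r"\Poles_PoleInspections",
    relationship_type="COMPOSITE",
    forward_label="Inspections",
    backward_label="Pole",
    message_direction="FORWARD",
    cardinality="ONE_TO_MANY",
    attributed="NONE",
    origin_primary_key="GlobalID",
    origin_foreign_key="PoleGUID",  # the GUID field in the related table
)
```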
My ObjectID fields are showing as unique as well. My target dataset was created as a blank FC on the SDE; my input dataset was from AGOL. I did eventually download it from AGOL, with the same result of Append failing. When I look at the index for the AGOL dataset, the GlobalID index has Unique set to "Yes". When I look at my target, the GlobalID index has Unique set to "No", and I had to manually create a new index with Unique set to "Yes". I'm assuming Append must be looking at this index for the GlobalID field and was failing because the index settings did not match up between the two classes.
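If it helps anyone compare those index settings without clicking through the dialogs, arcpy.ListIndexes will report them; the paths here are placeholders:

```python
import arcpy

# Hypothetical paths -- point these at your input and target classes.
for ds in [r"C:\data\agol_download.gdb\Assets", r"C:\data\connection.sde\Assets"]:
    for idx in arcpy.ListIndexes(ds):
        fields = ", ".join(f.name for f in idx.fields)
        print(f"{ds}: index {idx.name} on [{fields}], unique={idx.isUnique}")
```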
I just tried this in Pro, and it worked with no index at all for GlobalID. Indexing is only used for speeding up queries so it makes sense that an index shouldn't actually be required in order to append data to a field. I tried adding a GlobalID field in ArcCatalog, and it also generates an index for the field with Unique marked as 'No'. I really don't think the index has anything to do with it. Are you using File Geodatabases, SDE geodatabases, or Personal Geodatabases?
SDE geodatabases
I've been doing this in File Geodatabases, so I just tried this out in SDE with all 3 index scenarios, and they all worked:
- an index on GlobalID with Unique = "Yes"
- an index on GlobalID with Unique = "No"
- no index on GlobalID at all
I can't replicate the issue with index scenarios. It would be interesting to see if you can create a new FC in SDE, add GlobalIDs, and then append the data again for each of those indexing scenarios I just outlined above.
If yours works with the "index, Unique = Yes", and not the other two below it, that would be an interesting result. I would then repeat the experiment with a File Geodatabase target instead of SDE.
But if you can successfully preserve GlobalIDs across all 3 index scenarios in SDE, then my guess would be that something was corrupted in the original feature class index where you witnessed the non-preservation of GlobalIDs. By deleting the original index and recreating it, the corruption disappeared along with the bad index, and it has nothing to do with Unique = "Yes"/"No".
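For anyone wanting to run that experiment, a rough arcpy harness for the three index scenarios might look like this. All paths are placeholders, and some geodatabases may refuse to drop a system-maintained GlobalID index, so treat it purely as a starting point:

```python
import arcpy

arcpy.env.preserveGlobalIds = True

# Hypothetical paths -- substitute a real source and target workspace.
source = r"C:\data\source.gdb\Assets"
target_gdb = r"C:\data\connection.sde"  # repeat with a file GDB target, too

scenarios = {"unique_index": "UNIQUE", "nonunique_index": "NON_UNIQUE", "no_index": None}

for name, uniqueness in scenarios.items():
    # New empty FC using the source schema as a template.
    fc = arcpy.management.CreateFeatureclass(
        target_gdb, f"AppendTest_{name}", "POINT", template=source
    )[0]
    # Make sure the FC has a GlobalID field (the template may not carry it over).
    if not any(f.type == "GlobalID" for f in arcpy.ListFields(fc)):
        arcpy.management.AddGlobalIDs(fc)
    # Drop any index that was auto-created on GlobalID, then apply the scenario.
    for idx in arcpy.ListIndexes(fc):
        if any(f.name.lower() == "globalid" for f in idx.fields):
            arcpy.management.RemoveIndex(fc, idx.name)
    if uniqueness:
        arcpy.management.AddIndex(fc, "GlobalID", f"GIdIdx_{name}", uniqueness)
    try:
        arcpy.management.Append(source, fc, "NO_TEST")
        print(name, "-> Append succeeded")
    except arcpy.ExecuteError:
        print(name, "-> Append failed:", arcpy.GetMessages(2))
```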
If you're happy with the result you have now, by no means feel you need to satisfy my curiosity about the scenario. My motivation is to find where the flaw is. It's how I roll. 😉
I will give it a shot and report back when I get a chance.
So the only scenario I could get to work correctly was by manually creating an index with GlobalID set to 'unique' within the SDE. For some reason, when I try it in a File GDB, it won't allow me to create an index on GlobalID set to 'unique'. There must be something going on with the data I am trying to back up and how the GlobalID fields were created. It seems to be the only thing causing an issue when trying to preserve the GlobalIDs for the incoming data, and it's extra touchy.
I'll have to keep messing around to see if I can narrow it down any further.
Thanks a lot for this! I got stuck on it today. Append just fails right away without it.
This post is over 2 years old and this is still broken!
I heard from another team here that this has been an ArcMap bug since 10.0 and was fixed in 10.6.1 (so a long time). It seems to still be present in Pro, however.
This is great stuff!
Alex Friant I love your solution and updated post, and I tried those steps to a T, but still got the generic 999999 error. I have Preserve Global IDs checked, changed the field to GUID, and the field is already indexed, though not unique. I'm appending from one file GDB to another, using Pro. Any thoughts on anything else I can do?
I couldn't replicate those index-related issues; however, did you try Kyle Kaskie's solution from higher up in this thread? I quote Kyle:
UPDATED:
My solution: it turned out I needed to add an index on the GlobalID field of my archived feature class (the Target Dataset) and set it as unique. The append will then run without making adjustments to the field map.
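In script form, that fix is essentially one call against the Target Dataset before running Append (the path and index name here are placeholders):

```python
import arcpy

# Hypothetical target path -- add a unique index on GlobalID before appending.
target = r"C:\data\archive.gdb\Assets_Archive"
arcpy.management.AddIndex(target, "GlobalID", "GlobalID_Unique_Idx", "UNIQUE")
```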