Hi Vince,
I'm not sure what you mean by 'sdequery', though I'll pass this information on to my developers. Appreciate the help. So will 'sdequery' help determine whether the upgrade was successful, confirming that the data went from single to double precision?
From an analyst's perspective, I'm trying to find the best method for testing the data migration, to make sure that nothing was lost in the upgrade.
Your thoughts?
Thanks
Jennifer
'sdequery' is a custom command-line utility available within the se_toolkit suite (the link
for which was broken during the transition).
I think you're misunderstanding the significance of a HIGH-precision coordinate reference.
There is no "single to double precision" conversion. ArcSDE has always represented
coordinate data in double precision, and that has not changed. The only change is that
coordinates which were previously encoded in 32-bit integers (31 usable bits) are now
encoded in 64-bit integers (53 usable bits, the size of a double mantissa). This does not
change the precision of the data, since increasing the available bits does not improve accuracy.
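To put rough numbers on that (assuming an illustrative xyscale of 1,000,000, i.e.
micrometer-sized grid cells with meter units): 31 bits address 2^31 (about 2.1 billion)
grid cells, or roughly 2,147 meters of extent, while 53 bits address 2^53 (about 9.0e15)
cells, or roughly 9 billion meters -- a factor of 2^22 (about 4.2 million) more room to
spend on a finer grid, a larger extent, or both. None of that makes a coordinate that
was surveyed to the nearest centimeter any more accurate.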
The only way to test if anything was lost is to do a rigorous regression test. One way
to do that would be to query each table (source and destination) with
sdequery -t tabname +DIGEST tabname.dig,KEY=keycol -N
The trick here is that you need a non-OBJECTID key column in each table. If you
preserve the OBJECTID as 'origid' and eliminate the new rowid column from the column
list (-C flag), you could validate the row content, but only for simple feature classes --
ArcSDE API applications will not honor (or even see) relationship classes, feature
datasets, or other ArcObjects behaviors.
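For example, run the digest against the source and the destination instance in turn and
compare the output files (I've omitted the connection flags here since they vary by setup;
'parcels' and 'origid' are placeholder table and key names):
sdequery {connection flags} -t parcels +DIGEST parcels_91.dig,KEY=origid -N
sdequery {connection flags} -t parcels +DIGEST parcels_101.dig,KEY=origid -N
diff parcels_91.dig parcels_101.dig
If diff is silent, the row content survived the transfer; any reported difference should
point at the keys that need a closer look.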
In order to compare geometries between instances, you would need to use a
coordinate reference XYSCALE that is an even multiple of the previous scale
(if the old xyscale was 1000000, use 10000000 or 40000000, not 1111948722.22
or whatever odd-ball value Desktop uses by default), since the tiny
variance of division by xyscale will alter the double geometry extent (used by
the default PARTIAL comparison).
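A quick illustration of why the even multiple matters: a coordinate stored on the old
1,000,000 grid, say 123.456789, re-encodes exactly on a 10,000,000 grid
(123.456789 * 10,000,000 = 1,234,567,890, a whole number of cells), so the geometry
round-trips without change. Against an odd-ball scale like 1111948722.22 the product
is not a whole number, the encoder has to round, and every vertex (and therefore the
envelope) can shift by up to half a grid cell -- meaningless on a map, but enough to
make an exact geometry comparison fail.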
- V
Thanks Vince,
Right, the high-precision coordinate reference doesn't increase accuracy; it simply provides a higher resolution of the coordinate grid. So features can be as close as nanometers apart, whereas previously I believe the limit was 2 cm.
So the recommended testing steps moving forward would be to use the sdequery utility to test the 9.1 tables against the 10.1 tables using the query you noted above. All we would be testing are simple feature classes; none of them participate in relationships, etc. Question on the non-OBJECTID key: would we put this non-OBJECTID column in manually before the transfer to 10.1? I see where you are going with this...
On the coordinate reference scale side of things... where would we specify this multiplier?
Appreciate the help, Vince