ArcSDE 9.1 to ArcSDE 10.1 Documented Differences

07-03-2014 06:59 AM
JenniferDick
New Contributor II
Hello,

The organization I work for is currently using ArcSDE 9.1 (we are aware that ESRI no longer supports this platform) and we are working toward upgrading to 10.1 this year. There are multiple reasons why we are still on 9.1.

However, what I'm interested in is a white paper or a list of the striking differences between the two versions (9.1 and 10.1), for example double precision in 10.1 vs. single precision in 9.1. This information is critical for knowledge transfer and for understanding the upgrade.

I'm looking for sources or points of information to fully comprehend what the upgrade will consist of. 

Appreciate any help!

Thank you in advance,
Jennifer
12 Replies
VinceAngelo
Esri Esteemed Contributor
This isn't an "upgrade" because there is no possible transition path between 9.1 and 10.1
(I do hope the target is 10.1 SP1 with QIP).  You are, in effect, completely reloading your
spatial database into a completely different database.  As such, there is no documentation
on the changes (though it wouldn't hurt to read the 9.2, 9.3, 9.3.1, 10.0, 10.1, and 10.2
"What's New" documents to get an idea).

You will need to select a transfer mechanism (unfortunately, shapefiles and ASCII are
probably the only common data formats) and a geometry storage format, but there
has been so much evolution, you can just read up on the new ArcGIS planning docs.

- V
JenniferDick
New Contributor II
Thank you very much,

I seem to be getting conflicting perspectives on the upgrade transition. We have built a completely new server environment (Oracle, etc.) for this project, and because of this new environment we were informed it's possible to go straight from 9.1 to 10.1.

Your thoughts?
George_Thompson
Esri Frequent Contributor
As Vince mentioned, there is no direct mechanism for a 10.1 client to connect to a 9.1 enterprise geodatabase. You would at least need to go to a 9.3.x release first, then to 10.1 or newer.

Client and Geodatabase compatibility.
http://resources.arcgis.com/en/help/main/10.1/index.html#//003n00000008000000

You would also need to make sure that you have a supported RDBMS version for each database transition.

9.3.x - Oracle Requirements: http://downloads.esri.com/support/systemrequirements/arcsde_oracle_database_requirements.pdf
10.1 - Oracle Requirements: http://resources.arcgis.com/en/help/system-requirements/10.1/#/Oracle_Database_Requirements/01510000...

Hope some of this clears up any confusion.

-George
--- George T.
JenniferDick
New Contributor II
Thanks George, appreciate the feedback. I'll have to follow up with my developers on this.
JenniferDick
New Contributor II
Looking for a way to test and make sure that an upgrade from ArcSDE 9.1 to a supported version, and then to 10.1, happened correctly and that no spatial integrity was lost.

Any best practices out there that someone could offer? What should I be looking for, other than 32- to 53-bit comparisons, etc.?

Thanks
Jennifer
VinceAngelo
Esri Esteemed Contributor
The 'sdequery' utility of se_toolkit has a hidden "+DIGEST" option which was
created to generate row-level checksums in tables within ArcSDE geodatabases.  
The trick is locating a key column which isn't an SDE-set registered rowid, but
is otherwise unique across the table.  I successfully used this to detect a situation
where UTF-8 string data was corrupted by a character set issue in Oracle (comparing
10,000 pairs of small ASCII files is way easier than comparing 10,000 pairs of tables).
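
The idea of comparing small per-row checksum files instead of whole tables can be sketched without se_toolkit. The snippet below is only an illustration of the approach, not how sdequery works internally; the rows, column names, and key column are all made up, standing in for data exported from both instances:

```python
import hashlib

def row_digests(rows, key_col):
    """Map each row's key to a checksum of its remaining values."""
    digests = {}
    for row in rows:
        payload = "|".join(str(v) for col, v in sorted(row.items()) if col != key_col)
        digests[row[key_col]] = hashlib.md5(payload.encode("utf-8")).hexdigest()
    return digests

# Hypothetical rows exported from the old and new instances (e.g. via ASCII dump).
old_rows = [{"parcel_id": 1, "owner": "Smith", "area": 100.0},
            {"parcel_id": 2, "owner": "Jones", "area": 250.5}]
new_rows = [{"parcel_id": 1, "owner": "Smith", "area": 100.0},
            {"parcel_id": 2, "owner": "Jones", "area": 250.5}]

old_dig = row_digests(old_rows, "parcel_id")
new_dig = row_digests(new_rows, "parcel_id")
mismatches = [k for k in old_dig if new_dig.get(k) != old_dig[k]]
print(mismatches)   # an empty list means every row survived intact
```

As in Vince's description, the scheme only works if the key column is stable and unique across both instances.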

- V
JenniferDick
New Contributor II

Hi Vince,

I'm not sure what you mean by 'sdequery', though I'll pass this information along to my developers. Appreciate the help. So will 'sdequery' help determine whether the upgrade was successful when confirming that the data went from single to double precision?

I'm trying to find, from an analyst's perspective, the best method to test the migration of the data to make sure that nothing was lost in the upgrade.

Your thoughts?

Thanks

Jennifer 

VinceAngelo
Esri Esteemed Contributor

'sdequery' is a custom command-line utility available within the se_toolkit suite (the link for which was broken during transition).

I think you're misunderstanding the significance of a HIGH-precision coordinate reference. There is no "single to double precision" conversion. ArcSDE has always represented coordinate data in double precision, and that has not changed. The only change is that coordinates that were previously encoded in 32-bit integers (31 bits) are now encoded in 64-bit integers (54 bits [the size of a double mantissa]). This does not change the precision of the data, since increasing the available bits does not improve accuracy.
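
A small sketch of this kind of integer encoding may help; every value here is illustrative (the false origin, xyscale, and bit counts are not ArcSDE defaults), but it shows why wider integers extend the representable extent rather than the resolution:

```python
# Sketch of integer coordinate encoding (all values illustrative).
# A coordinate is offset by a false origin, multiplied by xyscale, and
# stored as an integer; resolution is 1/xyscale regardless of bit width.

FALSE_X = -180.0        # illustrative false origin
XYSCALE = 1_000_000     # illustrative scale: grid resolution = 1e-6 units

def encode(x):
    return round((x - FALSE_X) * XYSCALE)

def decode(i):
    return i / XYSCALE + FALSE_X

x = -122.123456
err = abs(decode(encode(x)) - x)          # bounded by half a grid cell

# Wider integers extend the representable extent, not the resolution:
extent_31_bits = (2**31 - 1) / XYSCALE    # ~2147 units at this scale
extent_54_bits = (2**54 - 1) / XYSCALE    # vastly larger, same resolution
```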

The only way to test if anything was lost is to do a rigorous regression test. One way to do that would be to query each table (source and destination) with

sdequery -t tabname +DIGEST tabname.dig,KEY=keycol -N

The trick here is that you need a non-OBJECTID key column in each table. If you preserve the objectid as 'origid' and eliminate the new rowid column from the column list (-C flag), you could validate the row content, but only for simple feature classes -- ArcSDE API apps will not honor (or even see) relationship classes, feature datasets, or other ArcObjects behaviors.
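
Once both instances have produced per-table digest files, the comparison itself reduces to a plain text diff. A minimal sketch (the file names and digest contents below are made up, standing in for the tabname.dig output on each side):

```python
import difflib, pathlib, tempfile

# Made-up digest files, one per instance, standing in for the tabname.dig
# output of a per-table checksum run on the source and destination.
tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "src_parcels.dig").write_text("1 a1b2c3\n2 d4e5f6\n")
(tmp / "dst_parcels.dig").write_text("1 a1b2c3\n2 d4e5f6\n")

src = (tmp / "src_parcels.dig").read_text().splitlines()
dst = (tmp / "dst_parcels.dig").read_text().splitlines()
delta = list(difflib.unified_diff(src, dst, lineterm=""))
verdict = "MATCH" if not delta else "DIFFERS"
print("parcels:", verdict)   # prints "parcels: MATCH"
```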

In order to compare geometries between instances, you would need to utilize a coordinate reference XYSCALE that is an even multiple of the previous scale (if the old xyscale was 1000000, use 10000000 or 40000000, not 1111948722.22 or whatever odd-ball value is used by default by Desktop), since the tiny variance of division by xyscale will alter the double geometry extent (used by the default PARTIAL comparison).
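
The even-multiple requirement can be checked with exact rational arithmetic. A small sketch (the scales are illustrative, with the odd value echoing the example above): any value on the old grid also lies on a new grid whose scale is an exact multiple, but requantizing onto an unrelated grid shifts it slightly:

```python
from fractions import Fraction

OLD = 1_000_000            # old xyscale
GOOD = 40_000_000          # an exact multiple of the old scale
ODD = 1_111_948_722        # an unrelated "odd-ball" scale

def requantize(k, old, new):
    """Snap the old grid value k/old onto the new grid and return it exactly."""
    x = Fraction(k, old)
    return Fraction(round(x * new), new)

k = 123_456_789            # a coordinate quantized on the old grid
x = Fraction(k, OLD)
print(requantize(k, OLD, GOOD) == x)   # True: the new grid contains the old one
print(requantize(k, OLD, ODD) == x)    # False: the value lands between new cells
```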

- V

JenniferDick
New Contributor II

Thanks Vince,

Right, the higher-precision coordinate reference doesn't increase accuracy; it simply provides a higher resolution of the coordinate grid. So features can be as close as nanometers; previously I believe it was 2 cm.

So the recommended testing steps moving forward would be to use the sdequery utility to test the 9.1 tables and the 10.1 tables using the query you noted above. All we would be testing is simple feature classes; none of them participate in relationships, etc. Question on the non-OBJECTID key: would we put in this non-OBJECTID column manually before the transfer to 10.1? I see where you are going with this...

On the coordinate reference scale side of things, where would we put in this multiplier?

Appreciate this help Vince
