SHAPE.area / length empty for sdo_geom data

11-15-2012 09:15 PM
HaniuHokkaido
New Contributor III
Hi,

I created a feature class from ArcCatalog using the SDO_GEOM keyword. The feature class was created successfully, I can insert new data (by drawing it manually in ArcMap), and I can view the data normally. But when I look at SHAPE.AREA and SHAPE.LEN, they are empty...

Why?

Database: Oracle 10.2.0.3, ArcSDE 10.0
15 Replies
VinceAngelo
Esri Esteemed Contributor
There is another option: ignore the useless linear-degree and square-degree values that would have been returned for LENGTH and AREA with GEOGCS coordinate values. If these attributes were meaningful in the context of the data (in meters, or km^2), they will have been calculated and preserved elsewhere, and if they are needed, they are needed the same no matter the storage format.
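
If meaningful measures are ever wanted, Oracle can also compute them on the fly. A minimal sketch (the table and column names are placeholders, and 0.005 is just a typical geodetic tolerance):

    -- For geodetic (GEOGCS) data these return meters / square meters by default
    SELECT objectid,
           SDO_GEOM.SDO_LENGTH(shape, 0.005) AS length_m,
           SDO_GEOM.SDO_AREA(shape, 0.005)   AS area_sq_m
      FROM my_feature_class;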

- V
WesKing
New Contributor
Thanks to both of you!

Vince, can I ask for a little clarification on the "other option"? I don't understand how this would help. Basically, since we are using the SDO_GEOMETRY type, the SHAPE.AREA/SHAPE.LEN fields are of no use to us... correct?

And since we are not "allowed" to add fields to the geometry table, it seems we would have to do as Marco stated.

Wes
MarcoBoeringa
MVP Regular Contributor
WesKing wrote:
We do use triggers on many fields and in the past used them on our own geometry fields. We've changed things a lot lately to be in compliance with SDSFIE 3.0 standards and it has caused HAVOC! Basically, all the fields we used to have in spatial tables are now in stand-alone ancillary tables. So our geometry fields are now in stand-alone tables... doesn't really make sense, does it?


I just did a very brief and quick read-up on this standard, as I am not familiar with it. To what extent are these changes really necessary? Physical (Geospatial Platform) constraints may limit what is feasible or wise.

Funny, related to this, I encountered two contradictory figures in two official DoD PDFs:

http://www.fgdc.gov/participation/coordination-group/meeting-minutes/2011/april/sdsfie-update-cg-201...
http://www.acq.osd.mil/ie/bei/disdi/factsheet_sdsfie.pdf

Notice how the left figure shows a "Platform Dependent" implementation at the "Geospatial Technology Platform" level, while the right figure shows "Platform Independent" implementation. I do not immediately see how the implementation at that level could be "Platform Independent"...

This PDF that you undoubtedly know of is interesting in the context of (ArcGIS) implementation:

How Will I Get My Data into a SDSFIE Geodatabase?
http://proceedings.esri.com/library/userconf/proc03/p0108.pdf

EDIT:
Ah, well, I now notice this is actually a very old 2002 PDF dealing with migration from coverages / shapefiles to a geodatabase, so it's not really relevant... although it does give some insight into what you are talking about and the background history of it.

And these links seem more recent and relevant to SDSFIE 3.0:
SDSFIE 3.0 Data Migration
http://www.acq.osd.mil/ie/download/disdi/presentations/SDSFIE3.0_Data_Migration.pdf

An overview of the SDSFIE v2.6 to v3.0 migration process
http://www.esri.com/esri-news/arcuser/spring-2013/an-overview-of-the-sdsfie-v26-to-v30-migration-pro...

ESRI Support for SDSFIE 3.0 Implementation
http://proceedings.esri.com/library/userconf/ieug09/papers/esri_support_for_sdsfie_nov2009_jay_cary....
WesKing
New Contributor
Yeah, I see exactly what you're saying about the discrepancy, but I can't explain why.

Basically, we're provided the exact table structure for our spatial tables, specific to the decision makers' decisions on how the tables should be structured. We have no say in what fields these tables will include. They are designed to be specific to different feature classes, though (i.e., buildings, natural resource surveys, power lines, etc. all have custom fields but also have certain fields in common). The ancillary tables that store the majority of the attribute data are totally up to us to structure. It's supposed to allow a certain amount of standardization in data storage (the spatial tables with a few "important" fields), but also allow a large amount of customizability (is that a word?) (the ancillary tables with anything we decide is important). And yes, it seems to be Arc specific.
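
As a rough illustration of the pattern (the table and field names below are made up, not our actual schema):

    -- Standardized spatial table: geometry plus a few mandated "important" fields
    CREATE TABLE building_spatial (
      feature_id    NUMBER PRIMARY KEY,
      facility_code VARCHAR2(20),   -- one of the fields common to all feature classes
      shape         SDO_GEOMETRY
    );

    -- Ancillary table: structure is up to us, linked back by feature_id
    CREATE TABLE building_ancillary (
      feature_id  NUMBER REFERENCES building_spatial (feature_id),
      roof_type   VARCHAR2(30),     -- anything we decide is important
      year_built  NUMBER(4)
    );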

I obviously gave the "short-and-sweet", but that's the gist of it.

Thanks again, I do appreciate your help.

Wes
VinceAngelo
Esri Esteemed Contributor
All I'll say on this is that:
1) I understand why some folks want to standardize data capture
2) I've never seen a project be successful when it strives to meet an arbitrary specification that can't be accurately populated -- the success of a project should not be measured by the number of fields, but by the ratio of the number of accurately completed cells to the total number of cells.
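
As a toy illustration of that metric (reusing the hypothetical ancillary table sketched above; non-NULL is only a proxy for "completed", and accuracy itself still needs review):

    -- Fraction of filled-in cells across two attribute columns
    SELECT (COUNT(roof_type) + COUNT(year_built)) / (2 * COUNT(*)) AS completion_ratio
      FROM building_ancillary;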

- V
MarcoBoeringa
MVP Regular Contributor
WesKing wrote:
Basically, we're provided the exact table structure for our spatial tables, specific to the decision makers' decisions on how the tables should be structured. We have no say in what fields these tables will include. They are designed to be specific to different feature classes, though (i.e., buildings, natural resource surveys, power lines, etc. all have custom fields but also have certain fields in common). The ancillary tables that store the majority of the attribute data are totally up to us to structure.


Here in the Netherlands, we currently have a big project going on that bears similarities to the one in the U.S. It involves standardization of large-scale mapping (1:00 - 1:1000) across the entire country.

They have taken a slightly different path, though. Instead of standardizing the actual implementation of spatial or attribute storage at any given site (we're talking about a few hundred organizations at municipal, provincial and national level that use a variety of CAD / GIS systems), they set up a support organization / facility that will hold the main database. The point where standardization takes place is the exchange of new or updated data. All organizations are forced to comply with an exchange standard, and there are very stringent, mostly automated, checks on spatial (e.g. topology) and attribute quality before delivered data is accepted and entered into the main database. Any data that fails will be rejected and must be repaired.
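
As a minimal sketch of the kind of automated spatial check involved (assuming Oracle Spatial storage; the table and column names are hypothetical, and 0.005 is just an example tolerance):

    -- Flag any delivered geometry that fails Oracle's validity rules
    SELECT feature_id,
           SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(shape, 0.005) AS validation_result
      FROM delivered_features
     WHERE SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(shape, 0.005) <> 'TRUE';

A delivery that returns any rows would be rejected and sent back for repair.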

From the main database, a number of products will be delivered that all organizations will, by law, need to make use of internally. It is the "gold" standard, so to speak. This means that exchange will be two-way: data producers will also be data consumers for those parts of the main database they do not maintain themselves, and must have facilities to consume change updates (similar to geodatabase replication).

Currently, this process is ongoing. Of course, it will be a headache to some, but all major CAD / GIS software vendors here in the Netherlands have committed to this process and are starting to implement (most already have) tools to facilitate this "change update" process and the creation of a standards-compliant GML exchange format with the main database facility organization.

What problems lie ahead no one can tell yet for sure, but I do know that one of the major players (a nationwide governmental organization maintaining the highway system and river / coastal defences) has already been working with a similar "change update" process, storage of history, and stringent automated QC for years, in combination with commercial data producers, and despite initial issues, has real-world and extensive experience with this. That real-world experience will probably pay off in this new project.