ESRI Enterprise geodatabase disagrees with ESRI File Geodatabase on what is a valid shape

03-13-2018 07:28 PM
JohnCuthbertson
New Contributor II

Some time ago I raised a discussion titled 'ESRI and Microsoft SQL disagreeing on what is a valid shape'. The responses to that gave me a far greater understanding of Resolution/Tolerance and of the differences between the ESRI and Microsoft SQL worlds. This discussion looks at the ESRI Enterprise geodatabase and the ESRI File Geodatabase having differing points of view on what is valid.

We are running ArcGIS 10.5.1 with SQL Server 2016.

The steps I took were:

  • Ran FME (2016) to read a SQL table containing 600K polygons and write to an ESRI Enterprise geodatabase with the default Resolution/Tolerance (0.000000001/0.000000008983153).
  • My understanding is that ESRI will not allow invalid (e.g. self-intersecting) polygons into the Enterprise geodatabase. This is confirmed by ArcCatalog managing to display the results.
  • Using ArcCatalog I then did an Export to a File geodatabase, with the target having the default Resolution/Tolerance (0.000000001/0.000000008983153).
  • I then ran Data Management Tools > Features > Check Geometry against the File geodatabase (see the arcpy sketch after this list).
  • This identified 315 'self-intersections'.
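A minimal arcpy sketch of that Check Geometry step, for anyone who wants to script it (the geodatabase and feature class paths are placeholders, not the actual data):

import arcpy

# Placeholder paths - substitute your own file geodatabase and feature class
fgdb = r"C:\data\export_test.gdb"
fc = fgdb + r"\parcels"
report_table = fgdb + r"\parcels_geom_errors"

# Check Geometry writes one row per problem found (e.g. 'self intersections')
arcpy.CheckGeometry_management(fc, report_table)

# Print what it found; the output table carries CLASS, FEATURE_ID and PROBLEM fields
with arcpy.da.SearchCursor(report_table, ["CLASS", "FEATURE_ID", "PROBLEM"]) as cursor:
    for row in cursor:
        print(row)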

This appears to me to be an illogical situation. How can an Enterprise geodatabase load an apparently 'self-intersecting' polygon without detecting an error? (Or is the logic in 'Check Geometry' different to that in Enterprise input processing?)

My next step is to try loading from SQL to the File Geodatabase (using FME) and to identify and resolve 'self-intersections', but I thought it worth raising the issue to a wider community before I waste time going down a wrong path.

3 Replies
JohnCuthbertson
New Contributor II

I have progressed further and am documenting my findings here to perhaps help some other poor soul travelling down the same path. I am not an ESRI expert; my findings are based purely on what I see.

I had previously said we were dealing with polygons; to be more precise, we are dealing with polygons and multipolygons.

If you have non-ESRI data and want it in an Enterprise geodatabase (and nowhere else), then using FME with an input of SQL data, passed through an FME GeometryValidator, is a reasonable way to go.

If, however, you then take the resultant Enterprise layer and move it to a File Geodatabase, you may well find additional 'self-intersections'.

On the other hand, if you want to be able to use the data in both a File Geodatabase and an Enterprise geodatabase, then the easiest path is to use FME to take the non-ESRI data and output it directly to a File Geodatabase (do not use the FME GeometryValidator, as it produces many false positives). Once it is in the File geodatabase you can then use the ESRI Repair Geometry tool to fix the self-intersections (hopefully). The resultant File Geodatabase layer can now be copied to an Enterprise geodatabase with confidence (an added bonus is that it can be exported to a shapefile as well). A minimal arcpy sketch of that repair-and-copy workflow follows.
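A minimal arcpy sketch of the Repair Geometry and copy steps (the file geodatabase path and the .sde connection file are placeholders):

import arcpy

# Placeholder: file geodatabase feature class loaded by FME
fgdb_fc = r"C:\data\loaded_from_fme.gdb\parcels"

# Repair Geometry fixes problems such as self-intersections in place;
# "DELETE_NULL" removes features whose geometry cannot be repaired
arcpy.RepairGeometry_management(fgdb_fc, "DELETE_NULL")

# Copy the repaired feature class into the enterprise geodatabase
# via a placeholder .sde connection file
sde_workspace = r"C:\connections\production.sde"
arcpy.FeatureClassToFeatureClass_conversion(fgdb_fc, sde_workspace, "parcels")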

My overall conclusion (if someone more knowledgeable can confirm) is that the File Geodatabase is more aligned with a shapefile than with an Enterprise geodatabase. Perhaps the accuracy at which a File geodatabase stores its data is the same as that used for a shapefile?

JoshuaBixby
MVP Esteemed Contributor

What coordinate system and projection are you using?  When you say "default Resolution/Tolerance," where is the default coming from? 

When I create a new feature class in a 10.6 enterprise geodatabase in SQL Server, the "defaults" for UTM Zone 15N are 0.0001 XY Resolution and 0.001 XY Tolerance, which agrees with Esri's documentation on XY Tolerance (Environment setting)—Help | ArcGIS Desktop:

  • For tools like Create Feature Class, Create Feature Dataset, and Create Raster Catalog, the default x,y tolerance for the output geodataset is 0.001 meters (1 millimeter) or its equivalent in map units. This environment can be set to a different value if the default is not acceptable.
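For illustration, a minimal arcpy sketch of setting those environments explicitly before creating output (the values, paths and WKID below are examples only, and the linear-unit string form for the environment settings is an assumption):

import arcpy

# Override the XY Resolution/Tolerance environments for subsequent geoprocessing output
# (assumed linear-unit string form, per the environment setting documentation)
arcpy.env.XYResolution = "0.0001 Meters"
arcpy.env.XYTolerance = "0.001 Meters"

# Example output: a new polygon feature class in NAD 1983 UTM Zone 15N (WKID 26915)
sr = arcpy.SpatialReference(26915)
arcpy.CreateFeatureclass_management(r"C:\data\test.gdb", "parcels_utm15n",
                                    "POLYGON", spatial_reference=sr)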

The "default Resolution/Tolerance" you show are incredibly small.  Whether working with meters or feet, you are dealing with nano-level precision, and I have never worked with a real-world data set that even gets within several orders of magnitude of that level of precision.

JohnCuthbertson
New Contributor II

Joshua

I am not working in feet/metres but in degrees, using the following coordinate system:

GCS_GDA_1994
WKID: 4283 Authority: EPSG

Angular Unit: Degree (0.0174532925199433)
Prime Meridian: Greenwich (0.0)
Datum: D_GDA_1994
  Spheroid: GRS_1980
    Semimajor Axis: 6378137.0
    Semiminor Axis: 6356752.314140356
    Inverse Flattening: 298.25722210

If I am using ArcCatalog and am in an FGDB, I right-click > New > Feature Class, specify a Name and hit Next, select 'Geocentric Datum of Australia 1994', then Next, and it prompts me for a tolerance with a default of 0.000000008983153 (if I hit 'Reset to Default' it is unchanged). I then hit Next, leave the storage configuration as default, hit Next, and Finish.
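A minimal arcpy sketch for confirming what resolution/tolerance a spatial reference, or an existing feature class, actually carries (the feature class path is a placeholder):

import arcpy

# Defaults ArcGIS derives for GCS_GDA_1994 (WKID 4283)
sr = arcpy.SpatialReference(4283)
print(sr.name, sr.XYResolution, sr.XYTolerance)

# What an existing feature class actually stored (placeholder path)
desc = arcpy.Describe(r"C:\data\test.gdb\parcels")
print(desc.spatialReference.XYResolution, desc.spatialReference.XYTolerance)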
