The effect of increasing the XY tolerance on performance

05-04-2014 11:26 AM
JamalNUMAN
Legendary Contributor

I'm wondering if increasing the XY tolerance (say, to 0.00000001) of a particular enterprise layer would slow down performance.

[Attached screenshot]

The reason is that my enterprise layer may receive polygons with very small areas (due to conversions from CAD using FME), and I don't want to consider these small areas as errors.

Thank you

Best

Jamal
----------------------------------------
Jamal Numan
Geomolg Geoportal for Spatial Information
Ramallah, West Bank, Palestine
6 Replies
MarcoBoeringa
MVP Regular Contributor
The reason is that my enterprise layer may receive polygons with very small areas (due to conversions from CAD using FME), and I don't want to consider these small areas as errors.


Do you really want to maintain these, most likely, sliver polygons in your final layers? They are likely to cause issues or at least confusion later on when the data is used or analysed. You may wish to have a look at the options presented on this page:

Removing slivers or gaps between polygons

I realize none of the options presented there is "ideal/perfect" (but then the real world isn't :eek:). Setting up a Topology will give you the most control over what is happening, and where, but requires manual intervention. Integrate is automatic, but requires very careful setting of the tolerance, and experimentation, so as not to lose data that you want to maintain.

I also see there is another tool presented on the first linked page that you may wish to explore: Eliminate, which also helps with cleaning up slivers.

You may wish to use a two-step approach:

- Run Integrate (and/or Eliminate) with an effective, but conservative(!), setting so as to clean up the bulk of tiny slivers or spurious small polygons.

- Then create a Topology on the resulting layer, and clean up the rest manually, assisted by the Topology tools.
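
As a toy illustration of that first, conservative pass, here is a pure-Python sketch that flags sliver candidates by area. The polygons, the shoelace helper, and the 0.05 m² threshold are all hypothetical; in practice Integrate/Eliminate or the feature class's shape-area field would do this work:

```python
# Hypothetical sketch: flag sliver polygons by area before manual
# topology cleanup. Pure-Python shoelace formula; threshold is
# illustrative only and must be tuned to the data.

def shoelace_area(ring):
    """Unsigned area of a simple polygon given as [(x, y), ...]."""
    n = len(ring)
    s = 0.0
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

polygons = {
    "parcel_a": [(0, 0), (10, 0), (10, 10), (0, 10)],        # 100 m^2
    "sliver_b": [(0, 0), (10, 0), (10, 0.001), (0, 0.001)],  # 0.01 m^2
}

MIN_AREA = 0.05  # m^2 -- a conservative, project-specific threshold
slivers = [name for name, ring in polygons.items()
           if shoelace_area(ring) < MIN_AREA]
print(slivers)  # ['sliver_b']
```

Anything surviving this cut would then go into the Topology-assisted manual pass.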
VinceAngelo
Esri Esteemed Contributor
I've cut storage size in half and doubled full table scan performance by removing five orders of magnitude from the X/Y scale. It follows that increasing by five orders of magnitude and increasing the polygon complexity by including microscopic features would slow all processing by three to ten times, possibly as much as two or three orders of magnitude (100x to 1000x). - V
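
Vince's storage point can be sketched with a toy model, assuming (as a simplification, not Esri's actual coordinate compression scheme) that each coordinate is stored as an integer index on a grid whose cell size is the x,y resolution; five extra orders of magnitude then add about 17 bits (log2(10^5) ≈ 16.6) per stored value:

```python
# Toy model only: coordinate -> integer index on the resolution grid.
def grid_bytes(coord_m, resolution_m):
    """Bytes needed to hold the grid index of one coordinate value."""
    idx = round(coord_m / resolution_m)
    return (idx.bit_length() + 7) // 8

x = 158432.1234  # a hypothetical easting in metres
print(grid_bytes(x, 0.0001))  # 4 bytes at a default-like resolution
print(grid_bytes(x, 1e-9))    # 6 bytes at a resolution 100,000x finer
```

Multiplied across millions of vertices (plus the extra vertices that microscopic features contribute), that per-coordinate growth is where the storage and scan-time cost comes from.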
JamalNUMAN
Legendary Contributor
I've cut storage size in half and doubled full table scan performance by removing five orders of magnitude from the X/Y scale. It follows that increasing by five orders of magnitude and increasing the polygon complexity by including microscopic features would slow all processing by three to ten times, possibly as much as two or three orders of magnitude (100x to 1000x). - V


Many thanks Marco and Vince for the help,

Am I correct if I say yes, increasing the XY tolerance will decrease performance?

In the same context, what might be the effect of increasing the resolution?

[Attached screenshot]
----------------------------------------
Jamal Numan
Geomolg Geoportal for Spatial Information
Ramallah, West Bank, Palestine
MarcoBoeringa
MVP Regular Contributor
In the same context, what might be the effect of increasing the resolution?


A potentially unwanted loss of accuracy/precision in the stored coordinates. With the default, you can reasonably store measurements to millimetre precision; changing the default resolution to something like 0.1 m will not allow that.
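
A minimal sketch of that loss (hypothetical values; snapping to the resolution grid here is a simplification of how coordinates are actually stored):

```python
def snap(value, resolution):
    """Snap a coordinate to the resolution grid."""
    return round(value / resolution) * resolution

x = 100.2347  # a surveyed easting in metres
print(round(snap(x, 0.0001), 6))  # 100.2347 -- millimetre detail survives
print(round(snap(x, 0.1), 6))     # 100.2    -- detail is irreversibly lost
```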

Also see this Help page:

The properties of a spatial reference

And mind the following quote from that page:

"Esri strongly recommends using the default x,y resolution in most cases because it has proved to perform quite well and can store adequate coordinate precision for most situations."

Also see this quite interesting Help topic. Although it is related to Parcel Fabrics, it gives some nice background regarding historic surveys / measurements and what "accuracy" they represented:

About accuracy
VinceAngelo
Esri Esteemed Contributor

Am I correct if I say yes, increasing the XY tolerance will decrease performance?


First you'd need to specify what you mean by "increase".  Are you operating on
a numerator or a denominator?  Changing from 0.001 meters (one millimeter)
to 0.000000001 meters (10 nanometers) is decreasing the value, but increasing
the precision. Changing from 0.01 meters (one centimeter) to 0.1 meters (one
decimeter) increases the value and increases the tolerance.

I would say that increasing the default precision without justification for highly
accurate data (e.g., a scanning electron microscope) is wasteful of storage and
processing capability.  Increasing the tolerance value will add "fuzziness" to your
location data under some processing conditions; the full impact depends on the
data and the processes.
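
A sketch of that "fuzziness", assuming a simplified grid-snapping model of cluster processing (the real cluster-tolerance algorithm is more involved; the coordinates here are hypothetical):

```python
def cluster_key(x, y, tolerance):
    """Points sharing a key are treated as coincident under this model."""
    return (round(x / tolerance), round(y / tolerance))

a = (100.003, 50.001)
b = (100.006, 50.004)  # a few millimetres from a
print(cluster_key(*a, 0.001) == cluster_key(*b, 0.001))  # False: kept apart
print(cluster_key(*a, 0.1) == cluster_key(*b, 0.1))      # True: merged
```

Under a 1 mm tolerance the two vertices remain distinct; under a 0.1 m tolerance they collapse to one location, which is exactly the kind of silent data change a coarse tolerance can introduce.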

- V
JamalNUMAN
Legendary Contributor
First you'd need to specify what you mean by "increase".  Are you operating on
a numerator or a denominator?  Changing from 0.001 meters (one millimeter)
to 0.000000001 meters (10 nanometers) is decreasing the value, but increasing
the precision. Changing from 0.01 meters (one centimeter) to 0.1 meters (one
decimeter) increases the value and increases the tolerance.

I would say that increasing the default precision without justification for highly
accurate data (e.g., a scanning electron microscope) is wasteful of storage and
processing capability.  Increasing the tolerance value will add "fuzziness" to your
location data under some processing conditions; the full impact depends on the
data and the processes.

- V


Thank you Marco and Vince for the very useful input. Now it is much clearer to me.

Best

Jamal
----------------------------------------
Jamal Numan
Geomolg Geoportal for Spatial Information
Ramallah, West Bank, Palestine