The reason for that is that my enterprise layer may receive polygons with very small areas (due to conversion from CAD using FME), and I don't want these small areas to be treated as errors.
I've cut storage size in half and doubled full-table-scan performance by removing five orders of magnitude from the X/Y scale. It follows that increasing the scale by five orders of magnitude, and increasing polygon complexity by including microscopic features, would slow all processing by three to ten times, possibly by as much as two or three orders of magnitude (100x to 1000x). - V
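To make the storage side of that concrete, here is a back-of-the-envelope sketch of my own (not from the quoted answer), assuming coordinates are stored as integer indices on a grid defined by the X/Y resolution; the extent and resolution values are illustrative only.

```python
# Back-of-the-envelope sketch: the X/Y resolution is the grid coordinates
# are snapped to, so each extra order of magnitude of precision adds
# roughly 3.3 bits to every stored coordinate.  The storage model and the
# numbers here are simplified and purely illustrative.
import math

def bits_per_coordinate(extent_m: float, resolution_m: float) -> int:
    """Bits needed to store one coordinate as an integer grid index."""
    return math.ceil(math.log2(extent_m / resolution_m))

extent = 1_000_000.0                       # a 1,000 km wide dataset
print(bits_per_coordinate(extent, 0.001))  # 30 -> fits in a 4-byte integer
print(bits_per_coordinate(extent, 1e-08))  # 47 -> needs an 8-byte integer
```

Going from 0.001 m to 1e-08 m (five orders of magnitude) pushes each coordinate past what a 4-byte integer can hold, which is roughly the kind of doubling the quoted answer describes.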
In the same context, what might be the effect of increasing the resolution?
Am I correct if I say YES, increasing the XY tolerance will decrease performance?
First you'd need to specify what you mean by "increase". Are you operating on
a numerator or a denominator? Changing from 0.001 meters (one millimeter)
to 0.000000001 meters (one nanometer) is decreasing the value, but increasing
the precision. Changing from 0.01 meters (one centimeter) to 0.1 meters (one
decimeter) increases the value and increases the tolerance.
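To pin down the difference, here is a minimal sketch of my own (purely illustrative) of how a resolution value acts as a snapping grid: each coordinate is rounded to the nearest multiple of the resolution, so a smaller value means a finer grid (more precision) and a larger value means a coarser grid.

```python
# Minimal sketch (illustrative only) of a resolution value as a snapping
# grid: each coordinate is rounded to the nearest multiple of the
# resolution.  Real geodatabases add an origin offset and integer
# encoding on top of this.
def snap(value: float, resolution: float) -> float:
    return round(value / resolution) * resolution

x = 123.456789
print(snap(x, 0.001))  # ~123.457  (finer grid: smaller value, more precision)
print(snap(x, 0.1))    # ~123.5    (coarser grid: larger value, less precision)
```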
I would say that increasing the default precision without justification (such as genuinely
high-accuracy data, e.g., from a scanning electron microscope) wastes storage and
processing capacity. Increasing the tolerance value will add "fuzziness" to your
location data under some processing conditions; the full impact depends on the
data and the processes.
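As a hedged illustration of that "fuzziness" (a toy distance check of my own, not the actual clustering algorithm any particular GIS uses): vertices closer together than the tolerance are treated as the same location during overlay and topology processing, so a larger tolerance merges more of them.

```python
# Toy illustration of cluster-tolerance "fuzziness": vertices closer
# together than the tolerance are treated as coincident.  This is a
# simple distance check, not the actual clustering algorithm any
# particular GIS uses.
import math

def coincident(p, q, xy_tolerance: float) -> bool:
    return math.dist(p, q) <= xy_tolerance

a = (100.0000, 200.0000)
b = (100.0006, 200.0000)                 # 0.6 mm apart

print(coincident(a, b, 0.001))   # True  - merged under a 1 mm tolerance
print(coincident(a, b, 0.0001))  # False - kept distinct under 0.1 mm
```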