Handling a large dataset

03-09-2014 06:35 AM
RotemGeva
New Contributor
Hello.
I'm working these days on a layer from the World Database on Protected Areas.
I found a lot of overlapping features (polygons).
I need to choose one of every overlapping feature.
The problem is that the layer contains up to 27,000 features.
Can someone recommend suitable tools that I can use?

thanks,
Rotem.
1 Reply
RichardFairhurst
MVP Honored Contributor

What do you mean by wanting to choose one of every overlapping feature? Do you want to eliminate the overlapping portion from all but one polygon? Or do you just mean you want to select them?

To just select them and isolate them from the non-overlapping polygons, use the Spatial Join tool to join the layer to itself with the One to Many option, applying a small negative buffer that is larger than your minimum tolerance. Then select all records in the output where Target_FID is not equal to Join_FID. Those are the overlapping polygons.
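
If you want to script that self-join, a minimal arcpy sketch could look like the following; the workspace path, layer names, and the plain intersect match are my own placeholders, so adapt them to your data:

import arcpy

# Placeholders: point these at your own geodatabase and WDPA layer.
arcpy.env.workspace = r"C:\GIS\WDPA.gdb"
arcpy.env.overwriteOutput = True

src = "protected_areas"
joined = "protected_areas_selfjoin"

# Join the layer to itself; One to Many writes one output row per
# pair of intersecting polygons.
arcpy.SpatialJoin_analysis(src, src, joined,
                           join_operation="JOIN_ONE_TO_MANY",
                           join_type="KEEP_COMMON",
                           match_option="INTERSECT")

# Rows where the target and join IDs differ intersect some other polygon.
# To ignore polygons that only touch along a boundary, shrink a copy of the
# layer with a small negative Buffer first and join against that copy instead.
arcpy.MakeFeatureLayer_management(joined, "overlap_lyr")
arcpy.SelectLayerByAttribute_management("overlap_lyr", "NEW_SELECTION",
                                        "TARGET_FID <> JOIN_FID")
print("Overlapping records: " + arcpy.GetCount_management("overlap_lyr").getOutput(0))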

To eliminate overlapping portions, try the Integrate tool on a copy of the data (not the original). After that you could use the Union tool, then the Multipart To Singlepart tool, and then the Eliminate tool (if you have an Advanced license) to find overlaps that are bigger than your Integrate tolerance. Try different Eliminate settings and examine the results to decide your tolerance for sliver polygons. Wherever the two source ObjectID fields created by the Union output don't match, there is an overlap.
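
Scripted, that sequence might look roughly like this; every path, name, tolerance, and the sliver area cutoff below is a placeholder, and the Eliminate step needs the Advanced license:

import arcpy

# Work on copies, never the original. All paths, names, tolerances, and the
# sliver area cutoff are placeholders; Eliminate needs an Advanced license.
arcpy.env.workspace = r"C:\GIS\WDPA.gdb"
arcpy.env.overwriteOutput = True

arcpy.CopyFeatures_management("protected_areas", "wdpa_a")
arcpy.CopyFeatures_management("protected_areas", "wdpa_b")

# Integrate snaps nearly coincident boundaries together; overlaps smaller
# than the cluster tolerance are absorbed at this step.
arcpy.Integrate_management(["wdpa_a", "wdpa_b"], "1 Meters")

# Union the layer with its copy so every overlap becomes its own polygon.
# In the output, rows where FID_wdpa_a <> FID_wdpa_b sit inside an overlap.
arcpy.Union_analysis(["wdpa_a", "wdpa_b"], "wdpa_union")
arcpy.MultipartToSinglepart_management("wdpa_union", "wdpa_single")

# Merge small slivers into the neighbor sharing the longest border
# (Advanced license). Tune the area threshold by inspecting the results.
arcpy.MakeFeatureLayer_management("wdpa_single", "sliver_lyr")
arcpy.SelectLayerByAttribute_management("sliver_lyr", "NEW_SELECTION",
                                        "Shape_Area < 1000")
arcpy.Eliminate_management("sliver_lyr", "wdpa_clean", "LENGTH")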

The Union and Multipart To Singlepart tools will make the dataset much larger if there are a lot of overlaps, but that is the reality of your data. 27,000 features is actually quite moderate in size. I have spent a day or two manually editing huge portions of datasets that size to deal with this sort of thing when I really wanted data that met my exact specifications.

If you don't have an Advanced license, or don't like the result, search the Python forum for code that deals with sliver merging. It is a very common topic.
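
Purely as an illustration of the kind of thing you will find there, you can flag likely slivers yourself with a simple thinness ratio before deciding how to merge them; the feature class name and the 0.1 cutoff here are invented for the example:

import math
import arcpy

# Thinness ratio = 4 * pi * area / perimeter**2; values near zero mean
# long, narrow polygons. The feature class name and cutoff are placeholders.
arcpy.env.workspace = r"C:\GIS\WDPA.gdb"
fc = "wdpa_single"

sliver_oids = []
with arcpy.da.SearchCursor(fc, ["OID@", "SHAPE@AREA", "SHAPE@LENGTH"]) as cursor:
    for oid, area, perimeter in cursor:
        if perimeter and 4 * math.pi * area / perimeter ** 2 < 0.1:
            sliver_oids.append(oid)

print("Candidate slivers: " + str(len(sliver_oids)))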