How not to Copy Features when too many would be output?

02-23-2012 03:26 PM
GraemeBrowning
Occasional Contributor III
This question may be better posed to the Python forum than this Geoprocessing forum, but I'll try here first.

I have a 3.5 million polygon layer (cadastre) stored in Oracle via ArcSDE.  I use a polygon feature chosen by an end user from a second layer to copy out the cadastral polygons that intersect it (via Copy Features).  For most polygons this works fine, returning results in the few seconds to few minutes needed, because I use the user's chosen polygon to set the geoprocessing extent (arcpy.env.extent) before executing Copy Features.

However, for a few polygon features the user might choose (think petroleum or gas pipeline corridors), the number of polygons to be copied out can reach about 200,000.  These can be copied in about 15 minutes, but rather than get them all back and write them into a very bloated PDF report, I would like to save time and pages by setting a maximum number of features that can be copied out, i.e. something like MaxRecordCount = 500 that would stop Copy Features once 500 features had been copied.

Is there a way to tell Copy Features to only copy up to a maximum number of features?

Alternatively, is there a fast way to get a count of how many features in an ArcSDE layer are within the current geoprocessing extent?
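
To illustrate the behaviour I'm after, here is a minimal sketch of a cursor-based copy that stops at a cap. The function and layer names are hypothetical, the paths are placeholders, and the arcpy.da cursors used need ArcGIS 10.1 or later:

import arcpy
import os

def copy_capped(source_fc, aoi_geometry, out_fc, max_features=500):
    # Select the cadastral polygons that intersect the user's polygon.
    lyr = arcpy.MakeFeatureLayer_management(source_fc, "capped_lyr")
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", aoi_geometry)

    # Create an empty output with the same schema as the source.
    arcpy.CreateFeatureclass_management(
        os.path.dirname(out_fc), os.path.basename(out_fc), "POLYGON",
        template=source_fc,
        spatial_reference=arcpy.Describe(source_fc).spatialReference)

    # Copy one row at a time and stop at the cap - the behaviour
    # Copy Features itself does not offer.
    fields = ["SHAPE@"] + [f.name for f in arcpy.ListFields(source_fc)
                           if f.editable and f.type != "Geometry"]
    copied = 0
    with arcpy.da.SearchCursor(lyr, fields) as scur, \
         arcpy.da.InsertCursor(out_fc, fields) as icur:
        for row in scur:
            icur.insertRow(row)
            copied += 1
            if copied >= max_features:
                break
    return copied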
5 Replies
DuncanHornby
MVP Notable Contributor
Graeme,

Just thinking off the top of my head here (and assuming you are using ArcGIS 10): why don't you create a model, then feed the Get Count tool into the Calculate Value tool to do an if/then test that outputs a Boolean?  This could act as a precondition to the Copy Features tool.
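
A rough script equivalent of that model logic might look like this (a sketch only; the function and layer names are placeholders, and in ModelBuilder the Boolean output would be wired in as a precondition rather than an if statement):

import arcpy

def copy_if_under_threshold(source_fc, aoi_geometry, out_fc, threshold=500):
    # Select the intersecting features, count the selection, and only
    # run Copy Features when the count is acceptable.
    lyr = arcpy.MakeFeatureLayer_management(source_fc, "precheck_lyr")
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", aoi_geometry)
    count = int(arcpy.GetCount_management(lyr).getOutput(0))
    if count <= threshold:
        arcpy.CopyFeatures_management(lyr, out_fc)
    return count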

Duncan
GraemeBrowning
Occasional Contributor III
Thanks Duncan

I need to revisit the timings for Get Count on a selection from a 3.5 million row ArcSDE for Oracle feature class, but I think it is in the range of 8-15 minutes, so that is one of the ArcGIS limits I am looking for a way around.
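
A possible way around paying for that full count is to count with an early exit (a sketch only; count_up_to is a hypothetical helper, and arcpy.da needs ArcGIS 10.1 or later):

import arcpy

def count_up_to(layer_or_fc, limit):
    # Count rows but bail out as soon as the limit is passed, so at
    # most limit + 1 rows are ever fetched, however big the selection.
    n = 0
    with arcpy.da.SearchCursor(layer_or_fc, ["OID@"]) as cur:
        for _ in cur:
            n += 1
            if n > limit:
                break
    return n  # any value > limit means "too many"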

- Graeme
GraemeBrowning
Occasional Contributor III
An ArcGIS Idea has now been posted requesting that Copy Features be enhanced to stop copying once a specified maximum number of features has been reached.
KimOllivier
Occasional Contributor III (Accepted Solution)
I used to have this problem with ArcStorm libraries: how to stop unreasonable requests from even starting.

My solution was to add a spatial index of my own that carries a summary feature count per tile. This is queried first, and if too many tiles or too many features are involved, the command is not run. The tiles can follow a quartered size pattern, just like ArcStorm's, to allow for the density difference between rural and urban areas.

It might take a little time to experiment with tile sizes and count the features, but the index does not need to be updated for normal editing because it will stay near enough.
ArcStorm automated the quartered tile building, but I cannot see that it would take much to replicate in a Python script, or perhaps use the Google tile pattern.
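
A minimal sketch of that pre-check, assuming a polygon tile grid built once up front (e.g. with Create Fishnet plus a Spatial Join to count parcels per tile) carrying a hypothetical FEAT_COUNT attribute. The path, attribute, and function names are all placeholders:

import arcpy

TILE_INDEX = "Database Connections/oracle.sde/OWNER.TILE_INDEX"  # hypothetical

def request_is_reasonable(aoi_geometry, max_tiles=50, max_features=500):
    # Sum the precomputed per-tile feature counts for the tiles that
    # the area of interest touches, and refuse oversized requests.
    tiles, total = 0, 0
    lyr = arcpy.MakeFeatureLayer_management(TILE_INDEX, "tiles_lyr")
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", aoi_geometry)
    with arcpy.da.SearchCursor(lyr, ["FEAT_COUNT"]) as cur:
        for (feat_count,) in cur:
            tiles += 1
            total += feat_count
    # Approximate, since edge tiles may only partly overlap the AOI,
    # but near enough to stop unreasonable requests from even starting.
    return tiles <= max_tiles and total <= max_features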
GraemeBrowning
Occasional Contributor III
Thanks Kim

I think that is a great workaround - and bonus points for it being ArcStorm-inspired.

I wrote a Python script to do adaptive tiling for this same client a year or two ago, when we were encountering a few features with millions of vertices, so we should be able to get some more value out of it.

It too was inspired by the way ArcStorm created its tile index.

I'm hoping my MaxRecordCount enhancement gets implemented all the same. In this instance I know my client will be happy with a "too many records, so none reported" response, but their original request was to stop copying/reporting at 500.

- Graeme