I have a script that identifies the locations of intersections between n-polygon boundaries and a line (a vector), using shapefiles without any spatial index.
At present it generates a polyline and uses a spatial filter to identify the relevant polygons, then loops through the vertices of each to find the intersection points. It is rather slow, as the process has to be repeated thousands of times. Implementing a spatial index would be a lengthy job, since it would mean recoding all the shapefile references throughout the software to access a geodatabase instead, and as I didn't write the software I'd rather restrict my tinkering to the one, easily restored, class.
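For context, the vertex loop described above boils down to repeated segment-segment intersection tests. A minimal pure-Python sketch (not ArcObjects; all names here are illustrative) of that inner computation:

```python
def segment_intersection(p1, p2, q1, q2):
    """Intersection point of segments p1-p2 and q1-q2, or None.

    Solves the 2x2 linear system from the parametric forms of the
    two segments; t and u are the parameters along each segment.
    """
    (x1, y1), (x2, y2) = p1, p2
    (x3, y3), (x4, y4) = q1, q2
    denom = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if denom == 0:  # parallel or collinear: no single crossing point
        return None
    t = ((x1 - x3) * (y3 - y4) - (y1 - y3) * (x3 - x4)) / denom
    u = ((x1 - x3) * (y1 - y2) - (y1 - y3) * (x1 - x2)) / denom
    if 0 <= t <= 1 and 0 <= u <= 1:
        return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
    return None

def transect_hits(transect, ring):
    """All points where a transect crosses a closed polygon ring."""
    hits = []
    for a, b in zip(ring, ring[1:] + ring[:1]):  # consecutive ring edges
        pt = segment_intersection(transect[0], transect[1], a, b)
        if pt is not None:
            hits.append(pt)
    return hits

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
# A horizontal transect through the square crosses two edges.
print(transect_hits(((-1, 2), (5, 2)), square))  # → [(4.0, 2.0), (0.0, 2.0)]
```

Doing this per vertex pair for every candidate polygon is exactly the cost that a cheap pre-test (discussed below) is meant to avoid.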
My question is what alternative functions users might suggest that could be quicker. I am considering a few:
iRay to identify the intersection points? Or LocateFeaturesAlongRoutes_lr? Or convert everything to lines and clip to a buffer of the vector first, before doing the intersection testing?
I am sure there are other ways to skin the cat, but which is likely to see a time saving?
Any suggestions, or references to a good source of explanations of the algorithms behind the classes, would also be welcome. For example, I assume the spatial filter deploys some sort of prior minimum-bounding-geometry test, but I'd like to know for sure.
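The kind of minimum-bounding-geometry pre-test assumed above can be sketched in a few lines; this is a generic illustration in pure Python, not a description of what the spatial filter actually does internally:

```python
def envelope(points):
    """Bounding box (xmin, ymin, xmax, ymax) of a point list."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def envelopes_overlap(a, b):
    """True if two bounding boxes intersect; a cheap rejection test."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])

line = [(0.0, 0.0), (10.0, 10.0)]
polygon = [(20.0, 20.0), (25.0, 20.0), (25.0, 25.0), (20.0, 25.0)]

# The polygon's envelope never touches the line's, so the expensive
# vertex-by-vertex intersection test can be skipped entirely.
print(envelopes_overlap(envelope(line), envelope(polygon)))  # → False
```

Since envelope comparison is a handful of float comparisons, a pre-test like this pays for itself whenever most polygons are nowhere near the line.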
Why don't you call the Intersect tool via IGeoProcessor to create all the intersection points in one hit, rather than looping with a spatial filter? If you need attribute information, say from the polyline dataset, you could then call the Spatial Join tool.
I had thought of using an IGeoProcessor intersect method, but I wonder how it actually works. Does it do any pre-tests to narrow the selection of polygons, e.g. minimum bounding rectangles or similar? The help for the ITopologicalOperator.Intersect method suggests testing for disjoint first to save time, which suggests that is what the geoprocessor should do, but does it?
If it tests against each polygon in the shapefile, then the code would be neater but might not be much quicker, and could even be slower, as I would be generating a lot of unwanted temporary shapefiles (the intersection points ultimately all need to end up in the same file).
Can one find out about the algorithms behind the ArcObjects classes? Is that commercially confidential, or is there a resource detailing them? I haven't found much in this respect in the Resource Centre pages.
I've never been able to find information about the underlying algorithms used by ESRI geoprocessing tools. Unless they are misbehaving, who cares how they do it as long as they are fast and accurate?
As I understand your initial description, you have one shapefile with a single polyline in it, and you are trying to find the intersection points in many polygon shapefiles; is this correct? If so, how many shapefiles are you talking about? If you are only generating a few layers with a few points, you could consider using in_memory layers as a way of speeding things up.
The line is currently being created as a polyline, but it is not a second dataset. I am tracing many transects across a dataset and finding which polygon boundaries they intercept, then picking up the attributes on either side of each intersection point. The transects start from a subset of the segments of each of those same polygons, hence the operations multiply up pretty fast. So I don't necessarily want to maintain the geometry of the intersection points, but equally I don't want to have to find each location individually, as is presently implemented.
The InMemoryWorkspaceFactory is a new one to me, and combined with a geoprocessing function (e.g. all the transects in one go) it could be what I am looking for.
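A related option that stays inside the one class, with no geodatabase recoding: build a throwaway in-memory pre-filter once, then run all the transects against it. A sketch (pure Python, all names hypothetical) that buckets polygon envelopes into a uniform grid so each transect only needs exact testing against nearby polygons:

```python
from collections import defaultdict

CELL = 10.0  # grid cell size; tune to the typical polygon extent

def envelope(points):
    """Bounding box (xmin, ymin, xmax, ymax) of a point list."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

def cells(env):
    """All grid cells an envelope touches."""
    x0, y0, x1, y1 = env
    for i in range(int(x0 // CELL), int(x1 // CELL) + 1):
        for j in range(int(y0 // CELL), int(y1 // CELL) + 1):
            yield (i, j)

def build_index(polygons):
    """Bucket each polygon id by the cells its envelope covers."""
    index = defaultdict(set)
    for pid, ring in polygons.items():
        for c in cells(envelope(ring)):
            index[c].add(pid)
    return index

def candidates(index, transect):
    """Polygon ids worth exact intersection testing for one transect."""
    ids = set()
    for c in cells(envelope(transect)):
        ids |= index.get(c, set())
    return ids

polygons = {
    "A": [(0, 0), (4, 0), (4, 4), (0, 4)],
    "B": [(50, 50), (54, 50), (54, 54), (50, 54)],
}
index = build_index(polygons)
print(candidates(index, [(-1, 2), (5, 2)]))  # → {'A'}
```

The index is built once per run, so with thousands of transects the per-transect cost drops from scanning every polygon to a couple of dictionary lookups plus exact tests on the few survivors.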