Spatial join fails, apparently due to too many fields?

04-19-2012 10:37 AM
deleted-user-dp61qRbKUaMp
New Contributor II
In some Python scripts I'd written, I noticed that Spatial Join started giving me generic error messages (one-to-one join, intersect, search radius of 1).
To add to the frustration, the join worked fine when run from ArcMap using exactly the same parameters.
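Roughly the sort of call my script makes (the paths, the empty field mapping, and the radius unit below are placeholders, not my real data):

import arcpy

# Placeholder feature classes -- my real ones each have about 36 fields.
target_fc = r"C:\data\work.gdb\target_features"
join_fc = r"C:\data\work.gdb\join_features"
out_fc = r"C:\data\work.gdb\joined_output"

# One-to-one join, INTERSECT match option, search radius of 1.
arcpy.SpatialJoin_analysis(target_fc, join_fc, out_fc,
                           "JOIN_ONE_TO_ONE", "KEEP_ALL", "",
                           "INTERSECT", "1 Meters")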

After a lot of head scratching and trial and error, I discovered that the only way to get the join to succeed was to delete a few fields from either the target or the join features. The precise number of fields I had to remove seemed random; each feature class being joined had only about 36 fields to begin with.

I'm using SP1. I'm nervous about upgrading because I had installed SP4 but then had to reinstall everything from scratch due to another bug with Spatial Join (possibly introduced with SP4?), which ESRI is currently addressing.

I did a quick search on this forum, scanned the issues addressed by each SP, and didn't see this problem mentioned.

Am I alone in hitting this problem?  Does anyone know whether it has been resolved by one of the service packs prior to SP4?

cheers
1 Reply
KimOllivier
Occasional Contributor III
I suspect you are running out of memory. Thirty-six fields is not a lot, but what sort of fields are they? You did not say how many records you are joining.
If you do the spatial join with a small sample, does it work? As a rule of thumb, most join operations fail for me when there are over a million records.
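A quick way to check is something like this (the paths and the OBJECTID cutoff are placeholders):

import arcpy

# Placeholder paths -- substitute your own data.
target_fc = r"C:\data\work.gdb\target_features"
join_fc = r"C:\data\work.gdb\join_features"
sample_fc = r"C:\data\work.gdb\target_sample"
out_fc = r"C:\data\work.gdb\sample_joined"

# Copy out a small subset of the target features (here the first 1000 OIDs)
# and run the same join on just those records.
arcpy.Select_analysis(target_fc, sample_fc, '"OBJECTID" <= 1000')
arcpy.SpatialJoin_analysis(sample_fc, join_fc, out_fc,
                           "JOIN_ONE_TO_ONE", "KEEP_ALL", "",
                           "INTERSECT", "1 Meters")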

If you have loaded tables from Excel, each text field defaults to 255 characters, even if it is only supposed to hold a few, and every numeric field defaults to double.
Try using a schema.ini in conjunction with a CSV dump to control the schema when loading, or load more explicitly using a script.
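A minimal sketch of the schema.ini approach (the folder, CSV name, and column definitions are placeholders for your own data):

import os
import arcpy

# Placeholder folder and CSV name -- adjust to your data.
csv_folder = r"C:\data\csv"
csv_name = "mytable.csv"

# schema.ini must sit in the same folder as the CSV; the section header is
# the CSV file name, and each ColN line fixes that column's type and width.
schema = """[{0}]
ColNameHeader=True
Format=CSVDelimited
Col1=PARCEL_ID Text Width 12
Col2=AREA Double
Col3=OWNER Text Width 50
""".format(csv_name)

with open(os.path.join(csv_folder, "schema.ini"), "w") as f:
    f.write(schema)

# Load the CSV into a file geodatabase table using the types from schema.ini.
arcpy.TableToTable_conversion(os.path.join(csv_folder, csv_name),
                              r"C:\data\work.gdb", "mytable")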

Maybe look at tuning your system. Defragment the disk, and make your swap space large and static by setting its minimum size equal to its maximum size.
Do you have enough spare scratch space? Keeping about 30% of the disk free is recommended. How much free memory do you have before starting?

Overlay processes in Workstation used to demand 13 times the size of the source coverages; I suspect the same ratio still applies.
Have you made your scratch workspace a file geodatabase? Otherwise you are limited to 2 GB, as with shapefiles.
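Something along these lines (the scratch location is just a placeholder):

import os
import arcpy

# Placeholder scratch location -- point this at a drive with plenty of room.
scratch_folder = r"D:\scratch"
scratch_gdb = os.path.join(scratch_folder, "scratch.gdb")

# Create the file geodatabase once, then point the scratch workspace at it
# so intermediate outputs are not written as 2 GB-limited shapefiles.
if not arcpy.Exists(scratch_gdb):
    arcpy.CreateFileGDB_management(scratch_folder, "scratch.gdb")

arcpy.env.scratchWorkspace = scratch_gdb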

A good test: if any single process takes longer than a cup of coffee, you have asked for the impossible and are probably invoking endless page faults that will crash hours later with no result. So interrupt it and find a better way.