Dreaded Error 999998 on large merge.
I'm running a Near analysis on a shapefile that needs to be split along boundaries. The reason for this requirement is that although an object within one boundary may be spatially closer to the target of the Near analysis, it cannot be linked to that object for other reasons. The script essentially performs a Near analysis based on spatial location relative to another layer.
My process is as follows:
Select the objects inside boundary N, copy every shapefile required for the Near analysis to in_memory, and perform the Near analysis on those copies; then N += 1. These files are named something along the lines of ' "name_" + str(N) '.
Repeat 2800 times, once for each boundary required.
This results in 2800 shapefiles in memory, each only about 50 KB at most. (I've run the same program storing things on disk instead of in_memory, but that slows it down immensely, by a factor of 10 or more.)
After this loop, I call arcpy's Merge tool on a list containing the names of all the files to be merged.
This is where the program crashes, but only when it's run on the entire data set (all 2800 boundaries).
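For reference, the failing step looks roughly like this (the names and count follow the convention above; the actual Merge call is shown commented, since it only fails inside an ArcGIS session):

```python
# Build the list of 2800 in_memory feature class names following the
# '"name_" + str(N)' convention described above (names are illustrative).
merge_list = ["in_memory/name_" + str(n) for n in range(1, 2801)]

# The call that raises error 999998 on the full set (assumed arcpy usage):
# import arcpy
# arcpy.Merge_management(merge_list, "in_memory/final_output")
```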
Strangely enough, the program runs just fine if I select smaller groups of files, merge each group into an intermediate file, and append that file to the final output file.
Is there some limit on the number of files that can be merged at once?
I've "solved" the problem first by changing "merge" to "append", and then by manually going through the map, selecting a set of boundaries, running the script on only those boundaries, and repeating ad nauseam until every boundary has been appended. I would really prefer not to select these boundaries and run the script by hand that many times, so this is not really a solution, more of a temporary workaround.
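Since the batch-merge-then-append approach works, it could presumably be automated instead of done by hand. A minimal sketch, assuming the same in_memory naming convention and a hypothetical batch size of 200 (the `chunked` helper is plain Python; the arcpy calls are shown commented and unverified):

```python
def chunked(items, size):
    """Yield successive slices of `items`, each at most `size` elements long."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Usage sketch (assumed arcpy calls, names illustrative):
# import arcpy
# names = ["in_memory/name_" + str(n) for n in range(1, 2801)]
# for i, batch in enumerate(chunked(names, 200)):
#     intermediate = "in_memory/batch_" + str(i)
#     arcpy.Merge_management(batch, intermediate)       # small merge per batch
#     arcpy.Append_management(intermediate, final_fc,   # final_fc created beforehand
#                             "NO_TEST")
```

This keeps every individual Merge call well below whatever threshold is triggering the crash, while still running unattended over all 2800 boundaries.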