Good afternoon everyone
I'm sure there is a very easy answer to this question, but I am banging my head against a wall, and I need outside assistance.
I have written a script that iterates through some tiled data, manipulates it, and writes the output (which should be a polygon feature class) to a feature dataset within a file geodatabase (fgdb).
For some stupid reason, the output is not being written to the target feature dataset; instead it is landing at the fgdb root as a table. Confusingly, the table contains a 'Shape' field.
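One thing I have been double-checking is how the output path gets assembled, since the workspace paths are the only thing I changed. If the feature-dataset component ever ends up empty after a path edit, the output location silently becomes the gdb root. A minimal sketch of what I mean (all names here are placeholders, not my actual paths):

```python
import os

def build_output_path(gdb, feature_dataset, name):
    # If the feature dataset component is empty (easy to do when
    # editing workspace paths), the joined path silently targets
    # the gdb root instead of the feature dataset.
    parts = [gdb, feature_dataset, name] if feature_dataset else [gdb, name]
    return os.path.join(*parts)

# Intended: output inside the feature dataset.
print(build_output_path(r"C:\data\tiles.gdb", "Results", "tile_05"))
# With the dataset component lost, the output moves to the gdb root.
print(build_output_path(r"C:\data\tiles.gdb", "", "tile_05"))
```

When the second form is handed to the geoprocessing tool as the output location, the result ends up at the fgdb root rather than inside the feature dataset, which at least matches the symptom I am seeing.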
What is more frustrating is that this script has run perfectly a number of times, and I have made no structural changes to it other than updating some workspace paths. Even more frustrating, it has also run perfectly since those paths were changed, but is now falling over for no reason that I can see.
The only possible cause I can think of is that the fgdb structure has become corrupted, as quite a bit of data is written in and out during this process. The tool compacts the fgdb on every 5th iteration, which is nominally when about 750 MB of data has passed through; in my experience that is roughly the point at which compacting starts to pay off for fgdb performance.
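For reference, the compaction cadence is just a simple counter, roughly the logic below, with `arcpy.Compact_management` called at each trigger point (sketched here without the arcpy call so it stands alone):

```python
def should_compact(iteration, every=5):
    # Compact on every 5th iteration (1-based); in this workflow
    # that corresponds to roughly 750 MB of data having passed
    # through the fgdb.
    return iteration % every == 0

# e.g. over 20 tiles the fgdb would be compacted four times:
print([i for i in range(1, 21) if should_compact(i)])  # → [5, 10, 15, 20]
```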
Is there anything stupid that I have missed? Why would this be misbehaving so badly? It all seems quite arbitrary.