I am trying to run a Euclidean allocation/distance command on a medium-sized raster (a HUC8 watershed, 8000 x 9000 cells). The in_source_data is a buffered stream network grid whose values are elevations. Both the inputs and outputs are stored in a file geodatabase created in ArcGIS 10. I am running it in Python (PythonWin) using the following snippet of pertinent code (after importing the necessary modules, etc.).
It runs seemingly well -- no errors raised. The problem is that when I look at the results, they look great in some areas and very incorrect in others (attached image 1). When you zoom in, you can see that the output is, in fact, corrupt and not just a display/pyramids problem.
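For context on what the tool is supposed to produce: Euclidean allocation assigns every output cell the value of its nearest source cell. Here is a minimal pure-Python sketch of that computation on a tiny grid -- this is my own illustration, not the original arcpy snippet, and the brute-force approach is only practical at toy sizes:

```python
import math

def euclidean_allocation(sources):
    """Brute-force Euclidean allocation on a small grid.

    `sources` maps (row, col) -> source value.  Every cell in the grid
    covering the sources' bounding box is assigned the value of its
    nearest source cell (ties go to the first source in dict order).
    """
    rows = max(r for r, _ in sources) + 1
    cols = max(c for _, c in sources) + 1
    out = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Nearest source by straight-line (Euclidean) distance.
            nearest = min(sources, key=lambda s: math.hypot(s[0] - r, s[1] - c))
            out[r][c] = sources[nearest]
    return out

# Two sources: elevation 10 at the top-left, 20 at the bottom-right.
alloc = euclidean_allocation({(0, 0): 10, (3, 3): 20})
```

In a correct allocation output, each region should be a contiguous zone of one source's value; patchy, noisy values like those in the attached image are what suggest corruption rather than a rendering issue.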
Is this the result of a memory limitation or error? The memory usage on my machine doesn't seem strained (PythonWin is using < 500 MB of the 2 GB available, with no other large processes running). Are there any other possible causes?
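As a rough sanity check on the memory question (my own back-of-the-envelope numbers; only the 8000 x 9000 grid size comes from the post, and the 32-bit cell type is an assumption):

```python
rows, cols = 8000, 9000      # grid size stated in the question
bytes_per_cell = 4           # assuming a 32-bit float raster
raster_bytes = rows * cols * bytes_per_cell
raster_mb = raster_bytes / 2**20
# A single in-memory copy of the raster is roughly 275 MB, so one copy
# fits easily in 2 GB -- though the tool may hold several intermediates
# (distance, direction, allocation) at once during processing.
```

So a single raster of this size is well within the available RAM, which is consistent with the observation that memory does not look strained.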