Python won't overwrite scratch rasters in loop - Arcpy - ArcGIS 10.0

5649 Views, 7 Replies
05-29-2013 06:03 PM
DavidSmith10
New Contributor II
When attempting to delete or overwrite a scratch raster in a raster geodatabase, I keep getting errors along the lines of "ERROR 000871: Unable to delete the output".

I'm working on an iterated series of raster processes, which involves a lot of intermediate rasters for each iteration. I'm forced, for dumb engineering reasons, to work in decimal seconds, so only rasters saved in a file geodatabase seem to work; IMGs, GRIDs, and temporary grids all produce errors with decimal-second data.

The problem I have is this: for some relatively long (geoprocessing-wise) period of time after I've finished reading from or writing to a raster, the raster remains locked and cannot be overwritten or deleted. Currently, I'm working around this by writing each intermediate raster to a unique filename, which leaves me with a massive collection of unneeded intermediate files cluttering up my geodatabase.

I have tried setting env.overwriteOutput = True and simply writing over the same scratch filename each iteration, and I have tried calling arcpy.Delete_management() on it at the end of each iteration; either way I still get ERROR 000871.
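Simplified, the loop pattern looks roughly like this (placeholder paths and names, not my real data):

import arcpy
from arcpy import env
from arcpy import sa

arcpy.CheckOutExtension("Spatial")
env.overwriteOutput = True
env.workspace = r"C:\work\workingFGDB.gdb"   # placeholder path

for i in range(10):                          # placeholder loop
    scratch = "scratchRaster"                # same scratch name every pass
    tmp = sa.IsNull("inputRaster")           # placeholder intermediate step
    tmp.save(scratch)                        # fails on later passes with ERROR 000871
    # ... more processing using 'scratch' ...
    arcpy.Delete_management(scratch)         # also raises ERROR 000871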

Is there a way to force arcpy/ArcGIS/Python to release a file at the operating-system level, or is there something I'm leaving out that would cause the files to be released? My Google-fu is failing me.

FWIW, running ArcGIS 10.0 (ArcInfo) with Spatial Analyst on Win7 64-bit.
7 Replies
DanPatterson_Retired
MVP Emeritus
Make sure the outputs aren't being added to the data frame; otherwise you will automatically have a file lock.
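For instance, if you ever run this inside ArcMap or as a script tool, something like the following near the top of the script should keep outputs out of the map (it has no effect in a plain standalone run):

import arcpy

# keep geoprocessing outputs from being added to the data frame,
# where the display would hold a lock on them
arcpy.env.addOutputsToMap = False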
DavidSmith10
New Contributor II
I'm running it standalone from a batch script, so there isn't a data frame. I'm new to this level of geoprocessing from scripts (I have traditionally done this kind of work in ERDAS Imagine, but needed more automation for this particular task), so it's possible that something similar is keeping things in memory longer than I'd like, but I can't figure out what it might be.
JamesCrandall
MVP Frequent Contributor

so only rasters saved in a file geodatabase seem to work; IMGs, GRIDs, and temporary grids all produce errors with decimal-second data.


We've just resigned ourselves to the fact that the system engineers and developers at ESRI must have integrated the FGDB into the software (Default.gdb) for a good reason, and that it's probably a good idea to just use them too. Most raster-processing problems we've encountered have been overcome by switching to FGDBs as the default output containers.

I think you have made a good choice in doing the same.



The problem I have is this: for some relatively long (geoprocessing-wise) period of time after I've finished reading from or writing to a raster, the raster remains locked and cannot be overwritten or deleted. Currently, I'm working around this by writing each intermediate raster to a unique filename, which leaves me with a massive collection of unneeded intermediate files cluttering up my geodatabase.


I'm a little confused: can you not delete each unique raster? You should be able to, and that seems like a reasonable way to get around your scratch-raster issue. You may have to post some code snippets.
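Something along these lines is what I had in mind (untested sketch, made-up names):

import arcpy
from arcpy import sa

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\work\workingFGDB.gdb"   # made-up path

for i in range(10):
    tmp_name = "tmp_raster_{0}".format(i)   # unique name per iteration
    tmp = sa.IsNull("inputRaster")          # made-up intermediate step
    tmp.save(tmp_name)
    # ... use tmp_name in the rest of the iteration ...
    del tmp                                 # drop the Raster object reference first
    arcpy.Delete_management(tmp_name)       # then remove the dataset itself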
RhettZufelt
MVP Frequent Contributor
If you have that raster assigned to an active variable, it will often lock that dataset.

For example, either of these

MyRast = "\\\\wc98466\\D\\baks\\temp.gdb\\MyRaster"

or

sCur = arcpy.SearchCursor("\\\\wc98466\\D\\baks\\temp.gdb\\MyRaster")

could lock my dataset to the point where I can't remove it. So you have to delete the variable first; that will (or is supposed to) remove the lock and let you clobber it. (Or delete it; for some reason you can't overwrite some objects, and they have to be deleted and then replaced.)

MyRast = "\\\\wc98466\\D\\baks\\temp.gdb\\MyRaster"
sCur = arcpy.SearchCursor("\\\\wc98466\\D\\baks\\temp.gdb\\MyRaster")

# delete the variables to release the references (and the locks)
del MyRast
del sCur



Just a thought,
R_

Also, if the raster is within a feature dataset in the FGDB and ANY objects from that feature dataset are in use, it will put a lock on the ENTIRE feature dataset. You can get around this by keeping these datasets at the root level of the FGDB.
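If you want to check whether something is still holding a lock before you try to clobber a dataset, arcpy.TestSchemaLock can tell you (rough sketch):

import arcpy

ds = "\\\\wc98466\\D\\baks\\temp.gdb\\MyRaster"

if arcpy.Exists(ds):
    if arcpy.TestSchemaLock(ds):
        arcpy.Delete_management(ds)   # no lock, safe to delete/overwrite
    else:
        print("Still locked - another reference or process is holding it")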

One more thought: since these are temporary rasters that you're trying to clobber each loop, have you tried storing them as in_memory objects ("in_memory/TmpRaster") rather than taking the time to write them to disk? (Some tools won't allow this, e.g. ProjectRaster.)
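For example (sketch only; I haven't tested this against your exact workflow, and as noted some tools won't accept in_memory rasters):

import arcpy
from arcpy import sa

arcpy.CheckOutExtension("Spatial")

# write the throwaway intermediate to memory instead of the FGDB
tmp = sa.IsNull("inputRaster")                   # placeholder input
tmp.save("in_memory/TmpRaster")
# ... use "in_memory/TmpRaster" in later steps ...
arcpy.Delete_management("in_memory/TmpRaster")   # free the memory when done
del tmp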
DavidSmith10
New Contributor II
Thanks R_

I'm honestly not sure that's my problem. While I usually store my filenames and paths in variables, out of some vague notion that that's a programming best practice, my most recent run-in with this issue happened when I had the path/name hard-coded directly in the call.

Here's the offending code:

# rasterize the building polygons (by building type) for this tile
arcpy.PolygonToRaster_conversion(bldglayer, "bldgtype", 'workingFGDB.gdb\\bldgtile' + str(i) + '_' + str(j), "#")
# 1 where there is no building (NoData), 0 where there is one
bldgnull = sa.IsNull('workingFGDB.gdb\\bldgtile')
bldgnull.save('workingFGDB.gdb\\bldgtilenull')
# replace null LULC cells with zero
nobldgtile = sa.Con(lulcnullname, 0, geolulcname, "VALUE > 0")
nobldgtile.save('workingFGDB.gdb\\nobldgtile')
# take LULC where there is no building, the building type where there is one
lulctile = sa.Con('workingFGDB.gdb\\bldgtilenull', 'workingFGDB.gdb\\nobldgtile', 'workingFGDB.gdb\\bldgtile', "VALUE > 0")
lulctile.save(lulctilename)



For reference, this takes a set of building classes from a building-outline polygon file and adds them to a land-use/land-cover dataset. The second iteration of this bit of code results in a cannot-delete-dataset error. Once I changed the various intermediate names ('bldgtilenull', 'nobldgtile', etc.) to variables with the iteration index tacked on, it worked fine (except for filling my FGDB with files I no longer needed).

A workaround would be to toss those raster names into a list and then do an iterative delete at the end of the process to remove them, but I'd rather do it as I go, just using the overwriteOutput environment setting.
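For what it's worth, the list version would look something like this (sketch; loop bounds are made up):

import arcpy

nrows, ncols = 4, 4        # made-up tile counts
scratch_rasters = []

for i in range(nrows):
    for j in range(ncols):
        bldgtilename = 'workingFGDB.gdb\\bldgtile' + str(i) + '_' + str(j)
        # ... create and use the intermediate rasters for this tile ...
        scratch_rasters.append(bldgtilename)

# single cleanup pass once all iterations are finished
for name in scratch_rasters:
    if arcpy.Exists(name):
        arcpy.Delete_management(name)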
curtvprice
MVP Esteemed Contributor
For reference, this takes a set of building classes from a building-outline polygon file and adds them to a land-use/land-cover dataset.


Here's an attempt to keep your temp datasets temporary (and have them deleted by default). I also set things up to work without the sa prefixes so the map algebra is easier to read.

import arcpy
from arcpy import env
from arcpy.sa import *

arcpy.CheckOutExtension("Spatial")
env.overwriteOutput = True
# for best performance of map algebra, scratch and current workspaces should be the same
env.workspace = env.scratchWorkspace = "workingFGDB.gdb"
lulc = Raster(lulcName)  # convert to a raster object for use in map algebra
...
  bldgtileName = "bldg{0}_{1}".format(i, j)
  lulctileName = "lulc{0}_{1}".format(i, j)
  arcpy.PolygonToRaster_conversion(bldglayer, "BLDGTYPE", bldgtileName)
  bldgtile = Raster(bldgtileName)  # convert to a raster object (note: this is a permanent raster)
  nobldgtile = Con(IsNull(lulc), 0, lulc)  # convert any null LULC cells to zero
  # replace null cells in bldgtile with lulc cells
  lulctile = Con(IsNull(bldgtile), nobldgtile, bldgtile)
  lulctile.save(lulctileName)  # this is the only one you want to save
  # arcpy.Delete_management(bldgtileName)  # uncomment to delete the intermediate raster


Note, the details on handling raster objects are documented here:

http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#/The_interaction_of_the_Raster_object_i...

My understanding is that the way to delete raster objects created using map algebra is not arcpy.Delete. Use del, or just let Python delete them when you reassign the variable on the next iteration and when the script closes after the last one.
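In other words, something like this (minimal sketch with a placeholder raster name):

import arcpy
from arcpy.sa import Raster, IsNull, Con

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = "workingFGDB.gdb"

lulc = Raster("lulc_tile")              # placeholder raster name
filled = Con(IsNull(lulc), 0, lulc)     # map algebra returns a temporary Raster object
filled.save("lulc_filled")              # saving makes the dataset permanent
del filled                              # release the object; Python would also do this
                                        # when the variable is reassigned or the script ends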
by Anonymous User
Not applicable

Wrapping my code in a function has solved this issue for me. As in:

def mf_function():
    # everything created here is local, so the references (and their locks)
    # are released when the function returns
    arcpy.PolygonToRaster_conversion(bldglayer, "bldgtype", 'workingFGDB.gdb\\bldgtile' + str(i) + '_' + str(j), "#")
    bldgnull = sa.IsNull('workingFGDB.gdb\\bldgtile')
    bldgnull.save('workingFGDB.gdb\\bldgtilenull')
    nobldgtile = sa.Con(lulcnullname, 0, geolulcname, "VALUE > 0")
    nobldgtile.save('workingFGDB.gdb\\nobldgtile')
    lulctile = sa.Con('workingFGDB.gdb\\bldgtilenull', 'workingFGDB.gdb\\nobldgtile', 'workingFGDB.gdb\\bldgtile', "VALUE > 0")
    lulctile.save(lulctilename)
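Then call it from the loop, so everything the function created goes out of scope when it returns. Roughly:

nrows, ncols = 4, 4          # made-up loop bounds
for i in range(nrows):
    for j in range(ncols):
        mf_function()        # intermediate references are released on return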

 
