Terrain to Raster Out of Memory Error

02-26-2014 05:50 PM
by Anonymous User
Not applicable
Original User: leonardlluz

A little background on my problem.

I have a terrain dataset that I want to convert to raster to get a DEM. However, I keep receiving a generic out-of-memory error. I tried it on two other machines with the same specs and ArcGIS build as mine, but I still got the same error.

Is this a bug in 10.2? I attached a screenshot of the error message and the specs of my workstation. Any help would be greatly appreciated. 🙂

-Leonard

[Attachments: screenshot of the error message and workstation specs]
8 Replies
curtvprice
MVP Esteemed Contributor
I tried it on two other machines with the same specs and ArcGIS build as mine, but I still got the same error.


How big a raster are you trying to create (rows x columns)? I would try setting the geoprocessing extent environment to a smaller area and see if the tool works when you limit the processing area. You may have to do it in pieces and use the Mosaic tool to build your output (assuming you want it all in one raster, which is not always the best approach for large rasters).

An Esri grid of that size would have an uncompressed size of roughly rows x columns x 4 / (1024 * 1024) megabytes (4 bytes per 32-bit floating-point cell).
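
For example, you can estimate the output rows, columns, and size up front in the Python window before running the tool (the dataset name "my_terrain" and the 10-unit cell size below are just placeholders):

import arcpy

cell = 10.0                                # placeholder output cell size
e = arcpy.Describe("my_terrain").extent    # placeholder dataset name
cols = int(e.width / cell)
rows = int(e.height / cell)
print rows, cols, rows * cols * 4 / (1024.0 ** 2), "MB uncompressed"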
by Anonymous User
Not applicable
Original User: leonardlluz

Thanks Curt!

I can't answer your question about the raster size since I'm just converting a terrain dataset to raster and the tool doesn't finish processing. However, it produced an incomplete DEM with dimensions of 19743 x 38266.

When we ran it on a different machine (much lower specs: quad-core, 4 GB RAM, 32-bit), it completed successfully. I'm not sure if this is a hardware issue, a memory leak, or something else.
curtvprice
MVP Esteemed Contributor
I can't answer your question about the raster size since I'm just converting a terrain dataset to raster and the tool doesn't finish processing. However, it produced an incomplete DEM with dimensions of 19743 x 38266


You can calculate the rows and columns from the extent in the Python window; for example, if your cell size is 30:

e = arcpy.Describe("myinput").extent   # extent of the input dataset
print e.width / 30, e.height / 30      # columns, rows at a 30-unit cell size


I believe you are hitting the 2.1GB wall:

>>> 19743 * 38266 * 4 / ( 1024. ** 2)
2881.9489974975586


Once you go beyond 2.1G writing to a single file (.tif), you can hit the 32-bit limit and the tool will fail. (Esri grids are effectively unlimited in size because they tile themselves into separate sub-2.1G pieces.) There is also sometimes an issue with needing more memory, especially for operations like Terrain to Raster. It may help to run the tool in the background (if 64-bit background geoprocessing is installed: 10.1 SP1 or later).

A neat trick when saving to TIFF is to integerize the output, since it is the compressed size that counts toward the 2.1G file size limit -- elevation surfaces often compress well.

Again, the solution is to process in smaller chunks and Mosaic them together. If there is memory swapping involved, it may work faster in smaller pieces anyway (I'm talking 10x or more faster).
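
Here is a minimal sketch of that chunked approach, assuming a 3D Analyst license and splitting the extent into two halves (the paths and the "CELLSIZE 10" sampling below are placeholders):

import arcpy
arcpy.CheckOutExtension("3D")

terrain = r"C:\data\project.gdb\myTerrain"    # placeholder path
e = arcpy.Describe(terrain).extent
xmid = (e.XMin + e.XMax) / 2.0

halves = []
for i, (x0, x1) in enumerate([(e.XMin, xmid), (xmid, e.XMax)]):
    # Limit processing to one half of the terrain at a time
    arcpy.env.extent = arcpy.Extent(x0, e.YMin, x1, e.YMax)
    out = r"C:\data\scratch\dem_half%d.tif" % i
    arcpy.ddd.TerrainToRaster(terrain, out, "FLOAT", "LINEAR", "CELLSIZE 10")
    halves.append(out)

arcpy.ClearEnvironment("extent")

# Stitch the halves back into a single DEM
arcpy.MosaicToNewRaster_management(halves, r"C:\data", "dem_full.tif",
                                   "", "32_BIT_FLOAT", "", 1)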
by Anonymous User
Not applicable
Original User: leonardlluz

A neat trick when saving to TIFF is to integerize the output, since it is the compressed size that counts toward the 2.1G file size limit -- elevation surfaces often compress well.


This is very helpful, Curt! Thank you so much. When you say integerize the output, is there a way to do this trick using ArcMap/ArcCatalog? Pardon me, but I'm not really familiar with this method.
curtvprice
MVP Esteemed Contributor
When you say integerize the output, is there a way to do this trick using ArcMap/ArcCatalog?


There is a tool to integerize rasters: Int.

You can round the data by using that tool inside the Raster Calculator. Integerizing the data (and thus improving the compression) may be the only way you can create a .tif file that is less than 2.1G so that ArcMap (32-bit) can read it. Again, tiling your data should be considered when you pass that 2.1G "wall."

Int("floatgrid" + 0.5)

If you need more precision (say, hundredths in the Z values), a trick is to scale your data so you can store it as integer, as long as you remember to use a Z factor of 0.01 (assuming your XY units are the same as your Z units) for tools that need it, like the Slope tool.

Int("floatgrid" * 100 + 0.5)
by Anonymous User
Not applicable
Original User: swalbridge

This doesn't help with the question at hand, but I just wanted to mention that the problem probably isn't the size of the TIFF -- ArcGIS has supported BIGTIFF since ArcGIS 10.0, and can write TIFF files of virtually unlimited size.
LeonardLuz3
Occasional Contributor
I'm still stuck on this problem, and one thing I suspect as the cause is a memory leak. When I run the tool, ArcCatalog consumes up to 5 GB of my RAM until the tool fails. I can't find any similar cases on the net either. Any thoughts are very much welcome.
by Anonymous User
Not applicable
Original User: swalbridge

If you want to try to get the whole terrain to run at once, a good place to start is this environment setting:

arcpy.env.terrainMemoryUsage = True


As mentioned in the documentation, this will work to minimize memory usage.
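
A rough sketch of a run with that setting (the terrain and output paths, data type, and cell size below are placeholders, and it assumes a 3D Analyst license):

import arcpy
arcpy.CheckOutExtension("3D")

# Ask the terrain tools to favor lower memory use
arcpy.env.terrainMemoryUsage = True

arcpy.ddd.TerrainToRaster(r"C:\data\project.gdb\myTerrain",
                          r"C:\data\dem.tif",
                          "FLOAT", "LINEAR", "CELLSIZE 10")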

Alternatively, you could use background processing to do the job, which can take advantage of your entire memory space rather than the rather small portion available to a 32-bit foreground task. Failing the two options above, you can do what Curtis suggested earlier in this thread and do the conversion on pieces of the terrain instead of the whole thing at once. This can be accomplished by partitioning the terrain into smaller areas for analysis, using something like a fishnet to define each 'slice', and then combining them at the end of the process (a sketch follows below). But I'd try the first two options first.
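
A rough sketch of that partitioning approach, with placeholder paths and a 4 x 4 grid of slices (again assuming 3D Analyst):

import arcpy
arcpy.CheckOutExtension("3D")

terrain = r"C:\data\project.gdb\myTerrain"    # placeholder
e = arcpy.Describe(terrain).extent

# Build a 4 x 4 fishnet of polygons covering the terrain's extent
fishnet = r"in_memory\tiles"
arcpy.CreateFishnet_management(fishnet,
                               "%s %s" % (e.XMin, e.YMin),
                               "%s %s" % (e.XMin, e.YMax),
                               "0", "0", 4, 4,
                               "%s %s" % (e.XMax, e.YMax),
                               "NO_LABELS", "", "POLYGON")

# Convert each slice separately, then mosaic the pieces as Curtis described
for i, (shape,) in enumerate(arcpy.da.SearchCursor(fishnet, ["SHAPE@"])):
    arcpy.env.extent = shape.extent
    out = r"C:\data\scratch\dem_tile%02d.tif" % i
    arcpy.ddd.TerrainToRaster(terrain, out, "FLOAT", "LINEAR", "CELLSIZE 10")

arcpy.ClearEnvironment("extent")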

Cheers,
Shaun