Setting Raster pixel depth and type in batch scripts?

06-30-2011 09:45 AM
MichelleKoo
New Contributor
Does anyone know of a way to set the output raster pixel depth and type when batch processing several rasters in ArcGIS 9.3 or 10 (command-line ArcObjects or Python, respectively) or in ArcInfo (AML)? I am perplexed by the output rasters of several operations (done in all of the above languages and environments!), since they range from 8-bit unsigned and signed to 16-bit signed, even though my input is 32-bit signed!

Does it go to the lowest common denominator? Is there some "pixel-saving" setting in ArcGIS environments so that it chooses a depth based on the output values? (It's true that some of the outputs are all positive integers.) I have looked through the help files for all of the above to see if there is a setting to use in the scripts, but am stuck now.
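For what it's worth, the behavior described matches a "shrink to fit" rule: the tool picks the smallest pixel type whose range can hold the output values, so an all-positive output under 256 comes back 8-bit unsigned. A rough illustration of that rule in plain Python — this is a guess at the logic, not documented ArcGIS internals, and the type names here are just labels:

```python
def smallest_pixel_type(vmin, vmax):
    """Pick the smallest integer pixel type whose range holds [vmin, vmax].

    Mirrors the apparent "shrink to fit" behavior of raster outputs;
    illustrative only, not an ArcGIS API.
    """
    candidates = [
        ("8_BIT_UNSIGNED", 0, 255),
        ("8_BIT_SIGNED", -128, 127),
        ("16_BIT_UNSIGNED", 0, 65535),
        ("16_BIT_SIGNED", -32768, 32767),
        ("32_BIT_SIGNED", -2**31, 2**31 - 1),
    ]
    for name, lo, hi in candidates:
        # First (smallest) type that covers the whole value range wins.
        if lo <= vmin and vmax <= hi:
            return name
    raise ValueError("range too wide for a 32-bit integer type")
```

So values 0..200 would land in 8-bit unsigned even if the input was 32-bit signed, which would explain the mix of output depths you are seeing.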

The only way I seem to be able to change it is to use the Copy Raster tool on the 8-bit outputs to make them all 16-bit, which seems an unnecessary and clunky workaround step.
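If it helps anyone else landing here, that Copy Raster workaround can at least be batched in a 10.x Python script. A sketch, assuming arcpy is available and that forcing everything to 16-bit signed via the Copy Raster `pixel_type` parameter is what you want (folder paths are hypothetical):

```python
import os

def batch_force_pixel_type(in_dir, out_dir, pixel_type="16_BIT_SIGNED"):
    """Copy every raster in in_dir to out_dir with a forced pixel type."""
    import arcpy  # only importable inside an ArcGIS 10.x install

    arcpy.env.workspace = in_dir
    for ras in arcpy.ListRasters():
        out = os.path.join(out_dir, ras)
        # Copy Raster accepts a pixel_type keyword such as "16_BIT_SIGNED"
        arcpy.CopyRaster_management(ras, out, pixel_type=pixel_type)
```

It's still the same clunky extra copy, but at least it runs unattended over a whole folder.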

Has anyone else had this issue?
Thanks for any help, or commiseration.
MK
1 Reply
BryceStath
Emerging Contributor
I am having a very similar issue, plus another where the Mosaic tool will not work within a Python script because it tells me my rasters are of different types, which they are not.
If I could specify the output raster type when I export from a Geostatistical Layer to Raster, or knew where this environment setting is stored, I would change it. But I too can find nothing.
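In case it's useful for diagnosing the Mosaic complaint, a sketch of how the pixel types could be inspected in a 10.x Python script, assuming arcpy — `Describe(...).pixelType` reports short codes, so a mismatch the tool objects to should show up directly:

```python
def report_pixel_types(rasters):
    """Return each raster's pixel type code so mismatches are obvious."""
    import arcpy  # only available inside an ArcGIS install

    types = {}
    for ras in rasters:
        # pixelType is a code such as "U8", "S16", or "F32"
        types[ras] = arcpy.Describe(ras).pixelType
    return types
```

Any raster whose code differs from the rest is the one to run through Copy Raster before mosaicking.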