Does anyone know of a way to set the output raster pixel depth and type when batch processing several rasters in ArcGIS 9.3 or 10 (command-line ArcObjects or Python, respectively) or in ArcInfo (AML)? I am perplexed by the output rasters of several operations (run in all of the above languages and environments): they range from 8-bit unsigned and signed to 16-bit signed, even though my input is 32-bit signed!
Does it go to the lowest common denominator? Is there some "pixel-saving" setting in the ArcGIS environments, so that the pixel depth is chosen based on the output values? (It is true that some of the output values are all positive integers.) I have looked through the help files for all of the above for a setting I could apply in my scripts, but I am stuck.
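For what it's worth, my working theory is a "smallest sufficient depth" rule: the output is written with the smallest integer type that can hold the result's value range. This sketch is only my guess at that rule, not ArcGIS's actual code; the function name and the pixel-type strings (borrowed from the Copy Raster tool's vocabulary) are illustrative.

```python
def minimal_pixel_type(vmin, vmax):
    """Guess the smallest integer pixel type covering [vmin, vmax].

    Illustrates the suspected 'pixel-saving' rule: all-positive outputs
    get unsigned types, and the bit depth grows only as the range requires.
    """
    if vmin >= 0:
        # All values non-negative, so an unsigned type suffices
        if vmax <= 255:
            return "8_BIT_UNSIGNED"
        if vmax <= 65535:
            return "16_BIT_UNSIGNED"
        return "32_BIT_UNSIGNED"
    # Negative values present, so a signed type is required
    if -128 <= vmin and vmax <= 127:
        return "8_BIT_SIGNED"
    if -32768 <= vmin and vmax <= 32767:
        return "16_BIT_SIGNED"
    return "32_BIT_SIGNED"
```

Under this rule, an operation whose results happen to fall in 0–255 would come back 8-bit unsigned regardless of the 32-bit input, which would explain the mix of depths I'm seeing.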
The only fix I have found is to run the Copy Raster tool on the 8-bit outputs to convert them all to 16-bit, which seems like an unnecessary and clunky workaround step.
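In case it helps anyone, here is roughly how I am scripting that workaround in ArcGIS 10. It loops Copy Raster over a list of rasters, forcing the output to 16-bit signed via the tool's `pixel_type` parameter. The workspace paths and raster names are hypothetical, and the `copy_raster` argument is injectable only so the loop can be exercised without an ArcGIS install.

```python
import os

def copy_rasters_to_16bit(in_workspace, out_workspace, raster_names,
                          copy_raster=None):
    """Copy each named raster, forcing 16_BIT_SIGNED output depth.

    `copy_raster` defaults to arcpy.CopyRaster_management; pass a stand-in
    callable to dry-run the loop outside ArcGIS.
    """
    if copy_raster is None:
        import arcpy  # requires an ArcGIS 10 install and license
        copy_raster = arcpy.CopyRaster_management
    outputs = []
    for name in raster_names:
        out_path = os.path.join(out_workspace, name + "_16bit")
        # pixel_type is the Copy Raster parameter that fixes the bit depth
        copy_raster(os.path.join(in_workspace, name), out_path,
                    pixel_type="16_BIT_SIGNED")
        outputs.append(out_path)
    return outputs
```

It works, but it doubles the I/O for every raster, which is why I would much rather find a setting that controls the depth of the original operation's output.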
Has anyone else had this issue?
Thanks for any help, or commiseration.
MK