
Clip converts my 16-bit unsigned data to 32-bit - bug?

Discussion created by mstjernholm on Apr 5, 2011
Latest reply on Jan 3, 2013 by MSummers-esristaff
Hi

I'm processing a huge bunch of large images. Part of the operation is to use Clip_management to cut out a section of each image using a vector feature. Unfortunately, Clip always produces a 32-bit unsigned int output even though the input is 16-bit unsigned int. I use TIFF as the output format and have tried setting the compression to both "NONE" and "LZ77" without any luck.
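
Here is a rough sketch of what I'm running; the paths, the NoData value and the clip feature class are placeholders, not my actual data:

import arcpy

# Tried both "NONE" and "LZ77" here - neither changes the pixel depth.
arcpy.env.compression = "LZ77"

in_raster  = r"C:\data\image_16bit.tif"   # 16-bit unsigned TIFF
clip_fc    = r"C:\data\clip_area.shp"     # vector feature used for the clip
out_raster = r"C:\data\image_clipped.tif"

# "#" uses the extent of the clip features; "ClippingGeometry" is the option
# that seems to trigger the promotion to 32-bit unsigned.
arcpy.Clip_management(in_raster, "#", out_raster, clip_fc, "0", "ClippingGeometry")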

How can I force 16-bit output without having to use the processing-costly route of:
arcpy.CreateRasterDataset_management(path, filename, PixSiz, pixelType.get(str(ValueType), "16_BIT_UNSIGNED"), outputSR, nBands)
followed by
an arcpy.Mosaic_management operation - which, besides adding an extra processing step, also becomes quite computing-intensive, as you can't prevent pyramid building when the images are larger than your screen size or similar!
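
For reference, that two-step workaround looks roughly like this; the values of path, filename, PixSiz, ValueType, outputSR and nBands are placeholders standing in for values taken from the input image:

import arcpy

# Placeholder values - in the real script these are derived from the input image.
path      = r"C:\data\out"
filename  = "image_16bit_target.tif"
PixSiz    = "10"                          # cell size
ValueType = 5                             # value-type code my lookup maps to 16-bit unsigned
outputSR  = arcpy.Describe(r"C:\data\image_16bit.tif").spatialReference
nBands    = 1

pixelType = {"5": "16_BIT_UNSIGNED"}      # extend for other value types

# Step 1: create an empty raster dataset with the pixel depth forced to 16-bit unsigned.
target = arcpy.CreateRasterDataset_management(
    path, filename, PixSiz,
    pixelType.get(str(ValueType), "16_BIT_UNSIGNED"),
    outputSR, nBands)

# Step 2: mosaic the 32-bit clip result into the 16-bit target, which pushes
# the data back down to the intended pixel depth.
arcpy.Mosaic_management(r"C:\data\image_clipped.tif", target, "LAST")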

The problem seems specifically related to the "Clipping Geometry" option, which I need to use because I:
a) need to reduce the size to a minimum
b) later need to calculate the histogram for a specific area using BuildRasterAttributeTable (rough sketch below).
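
For (b), the idea is simply the following; the path is a placeholder for the clipped raster:

import arcpy

# Build the attribute table on the clipped (integer) raster so I can read the
# per-value counts for the clipped area; "Overwrite" replaces any existing table.
arcpy.BuildRasterAttributeTable_management(r"C:\data\image_clipped.tif", "Overwrite")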

The help does include a warning: "If Clipping Geometry is used, then the pixel depth of the output may be promoted." But it is not clear when and why the change happens.


Is this in reality a bug that others have seen? I'm running Windows 7 64-bit, ArcGIS 10 SP1.

All ideas are welcome.

Michael
