Raster histogram different between debug and release?

07-18-2013 09:54 AM
New Contributor

I'm working on a program that creates a raster in memory, saves it to a file geodatabase, and then adds it to the TOC. When run in debug mode (C#, Visual Studio 2012, all ESRI binaries loaded from the GAC) the code works fine. When the extension is run directly from ArcGIS Desktop, however, the histogram for the output raster is way off, causing the stretch to look very bad. I verified the "bad" histogram by running a Sample operation and checking the statistics for the output column. Please see the images below to see what I mean.

Here is the code block I'm using to calculate the statistics. As you can see I'm trying just about everything, although maybe not in the correct order. I've tried placing this block before saving to the geodatabase, after saving, and after adding the layer to the TOC.

Note: pRDS is an IRasterDataset

    IRasterBandCollection pRastBands = (IRasterBandCollection)pRDS;
    IEnumRasterBand enumRasterBand = pRastBands.Bands;
    IRasterBand pRastBand = enumRasterBand.Next();
    while (pRastBand != null)
    {
        // Recompute statistics and histogram for each band
        pRastBand.ComputeStatsAndHist();
        pRastBand = enumRasterBand.Next();
    }
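In case the per-band calls are the problem, the same statistics pass can also be expressed through the geoprocessor once the raster is on disk. This is only a sketch; the dataset path is hypothetical:

```csharp
// Sketch: compute statistics via the CalculateStatistics GP tool
// instead of per-band ComputeStatsAndHist calls.
using ESRI.ArcGIS.Geoprocessor;
using ESRI.ArcGIS.DataManagementTools;

Geoprocessor gp = new Geoprocessor();
CalculateStatistics calcStats = new CalculateStatistics();
calcStats.in_raster_dataset = @"C:\data\output.gdb\myRaster";  // hypothetical path
calcStats.skip_existing = "OVERWRITE";  // recompute even if stats already exist
gp.Execute(calcStats, null);
```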

As a temporary fix, when adding the layer to the TOC I use IRasterStretch2.StretchStatsType to set the stretch type to "AreaOfView". That makes the raster look better but doesn't solve the underlying histogram problem.
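For reference, the workaround looks roughly like this. The cast assumes the layer is drawn with a stretch renderer, and the layer variable name is mine:

```csharp
// Sketch of the temporary workaround: base the stretch on the area of view
// so the renderer ignores the bad dataset histogram.
// pRasterLayer (IRasterLayer) is assumed to be the layer added to the TOC.
IRasterStretchColorRampRenderer stretchRenderer =
    pRasterLayer.Renderer as IRasterStretchColorRampRenderer;
if (stretchRenderer != null)
{
    IRasterStretch2 stretch = (IRasterStretch2)stretchRenderer;
    stretch.StretchStatsType = esriRasterStretchStatsType.esriRasterStretchStats_AreaOfView;
    pRasterLayer.Renderer = (IRasterRenderer)stretchRenderer;
}
```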

Any help would be much appreciated!

Jim Dillon

"Bad" histogram:

"Good" histogram:

Stats from Sample operation on raster with "bad" histogram: