Combining rasters with Con/IsNull

04-30-2015 01:20 AM
NeilAyres
MVP Alum

A few days ago I was building a cost surface (so maybe this belongs in the Spatial Analyst section).

The input was an extensive roads network, with different classes of roads ("HIGHWAY", "MAJOR ROAD", etc.).

I separated out each type and rasterized the vectors with an "impedance" value. This roughly equates to the speed of the road, or rather the time to pass through each pixel of the raster. So a highway would have less impedance than a street.

Then I had to combine these all back together to form my cost surface using con/isnull functions.

I was contemplating how to put together a multiple embedded Con/IsNull when I thought I would give this a try.

It worked perfectly, and was very fast. The raster size was over 16,000 by 14,000 cells.

featList is a list of my road classes, ordered by priority.

import arcpy
from arcpy.sa import Con, IsNull, Float

cntFeat = 0
for r in featList:
    rName = r[1]
    if rName in rasList:
        cntFeat += 1
        print "Processing {}".format(rName)
        if cntFeat == 1:
            # Start with the highest-priority class
            Ras = Float(arcpy.Raster(rName)) / 100
        else:
            # Fill in only the NoData cells with the next class down
            Ras2 = Float(arcpy.Raster(rName)) / 100
            Ras = Con(IsNull(Ras), Ras2, Ras)
# Any cells still NoData get the default off-road impedance
Ras = Con(IsNull(Ras), secsPerCellOther, Ras)
Ras.save("Imp_Final")
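For anyone not running arcpy, the logic of the loop can be sketched in plain Python (a toy illustration only: each "raster" is a flat list of cell values, None stands in for NoData, and the class names and values are made up):

```python
# Toy illustration of the iterative Con(IsNull(...)) priority merge.
# Rasters are listed highest priority first, as in featList.
highway = [0.1, None, None, 0.1]
major   = [0.3, 0.3, None, None]
street  = [None, 0.5, 0.5, 0.5]

def priority_merge(rasters, default):
    """Per cell, keep the first non-NoData value; fall back to a default."""
    merged = list(rasters[0])
    for ras in rasters[1:]:
        # Con(IsNull(merged), ras, merged): fill only the NoData cells
        merged = [r if m is None else m for m, r in zip(merged, ras)]
    # Final Con(IsNull(...), secsPerCellOther, ...) step
    return [default if m is None else m for m in merged]

print(priority_merge([highway, major, street], default=2.0))
# -> [0.1, 0.3, 0.5, 0.1]
```

Each pass only touches cells that are still NoData, which is why earlier (higher-priority) classes always win.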

This iterative approach is a good one to remember.

BTW, the individual inputs were converted to LONG (a 16-bit type) after multiplying by 100, just to save some space and time.

I could not get Feature to Raster to output anything other than a 64-bit raster if the input variable was a double.

I also could not get env.compression to work.

6 Replies
XanderBakker
Esri Esteemed Contributor

I have seen some discussions on the pixel type taking up more space than necessary, depending on the workspace used (grids vs. rasters in a file geodatabase).

What I have done in the past, when I needed to combine a lot of rasters containing NoData, is use the Cell Statistics (Spatial Analyst) tool. It takes care of the NoData values too.
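For context, Cell Statistics computes a per-cell statistic across the input rasters and, with the ignore-NoData option, skips NoData cells. Roughly, for the MINIMUM statistic (a toy illustration only, with None standing in for NoData):

```python
# Toy illustration of Cell Statistics with MINIMUM, ignoring NoData.
def cell_minimum(rasters):
    """Per cell, take the minimum of the non-NoData values (None = NoData)."""
    out = []
    for cell_values in zip(*rasters):
        present = [v for v in cell_values if v is not None]
        out.append(min(present) if present else None)
    return out

a = [1.0, None, 4.0]
b = [2.0, 3.0, None]
print(cell_minimum([a, b]))
# -> [1.0, 3.0, 4.0]
```

Unlike the Con/IsNull loop, the statistic looks at the cell values themselves rather than a class priority order, which is the distinction Neil raises below.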

Neil Ayres, is this a question? If not, could you un-mark the discussion as a question?

NeilAyres
MVP Alum

Xander,

No, I suppose this is not a question, just an interesting method. So I would if I knew how...

Marked it "Assumed answered" instead, so you got the .

Don't see how Cell Statistics could help with this data. I wanted each new raster added only where the existing raster is null, and in my priority order, as the iteration loop shows.

I was quite surprised how fast this loop was, each step only taking a second or so. The time-consuming bit was the save at the end.

Do you have any insight on how or if the bit depth of the output can be controlled?

Or even why the environment compression setting doesn't seem to be honoured.

My version is 10.2.2.

XanderBakker
Esri Esteemed Contributor

Hi Neil Ayres,

I suppose my procedure was different, but in my case it turned out to be more effective to combine the rasters with the weights using the Cell Statistics tool. Your fast results look promising; I will keep it in mind the next time I have a similar challenge.

There are a number of threads on automatic bit depth promotion. Many refer to this Help topic:

Bit depth capacity for raster dataset cells

curtvprice
MVP Esteemed Contributor

> I couldn't get env.compression to work

These settings do not apply to all formats; many apply only to geodatabase rasters.

You also may have better luck getting those raster environments recognized by using the Copy Raster tool instead of the .save() method. (I have learned you can provide a (temp) raster object as input to Copy Raster.)
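A sketch of that route, as a configuration fragment (the tool and environment names come from the standard arcpy docs; the workspace path, raster names, and pixel type are placeholders, not from this thread):

```python
import arcpy
from arcpy import env

env.workspace = r"C:\data\cost.gdb"   # placeholder path
env.compression = "LZ77"              # honoured by some formats only

# Copy Raster exposes pixel_type explicitly, which .save() does not;
# a Raster object can be passed as the input.
arcpy.CopyRaster_management("Imp_Final", "Imp_Final_32bit",
                            pixel_type="32_BIT_FLOAT")
```

The pixel_type argument is what lets you step the output down from the automatically promoted 64-bit depth.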

NeilAyres
MVP Alum

Curtis,

That is what I did afterwards (Copy Raster), to bring the final bit depth down from 64-bit to 32-bit.

My rasters were in a file geodatabase.

Maybe it is an idea to have an environment setting for the bit depth?

I see the 2nd script you pointed to in your link uses exactly the same approach.

I am obviously following in the steps of the master...