POST
I'm doing a lot of calculations based on elevation values. The current problem involves alternative methods for calculating and manipulating slope, aspect, and derivative products, but that's not the point now.

I could tile the datasets into smaller pieces, and maybe will have to if no other method presents itself, but that's an inelegant and unsatisfactory solution. Because of edge effects, the tiles would need to overlap, and the overlaps would then need to be removed when joining the processed tiles back together. It could be automated, and perhaps someone else has already written a script to do that, but I'm looking for a better and more fundamental answer.

I'm trying to use the ESRI geoprocessor and related available tools, without the user needing to install anything extra; all of our GIS work is done in the ESRI environment. One good reason for staying with Spatial Analyst is the ease of making calculations based on moving (running) windows, such as focal stats and kernels.

I don't see any way around the precision problem for large rasters with the current version of ArcGIS. Map Algebra can handle large rasters (albeit slowly), and Python/numpy can handle high-precision numbers, but I don't have a way to handle both at the same time.
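For reference, the overlap-and-trim tiling described above can be sketched in pure NumPy. A 3x3 focal mean stands in for any moving-window operation; the function names (focal_mean, tiled_focal_mean) are made up for this sketch, and a production version would read each tile from disk rather than slicing an in-memory array. The point is just that a halo as wide as the kernel radius makes the stitched result match the single-pass result exactly:

```python
import numpy as np

def focal_mean(a):
    """3x3 focal mean with edge replication (pure NumPy, no SciPy)."""
    n, m = a.shape
    p = np.pad(a, 1, mode="edge")
    out = np.zeros((n, m), dtype=np.float64)
    for dr in range(3):
        for dc in range(3):
            out += p[dr:dr + n, dc:dc + m]
    return out / 9.0

def tiled_focal_mean(a, tile=256, halo=1):
    """Process `a` in overlapping tiles and trim the halo when
    stitching; with halo >= kernel radius the mosaic matches the
    single-pass result exactly."""
    rows, cols = a.shape
    out = np.empty((rows, cols), dtype=np.float64)
    for r0 in range(0, rows, tile):
        for c0 in range(0, cols, tile):
            r1 = min(r0 + tile, rows)
            c1 = min(c0 + tile, cols)
            # grow the tile by the halo, clipped at the raster edges
            rs, cs = max(r0 - halo, 0), max(c0 - halo, 0)
            re, ce = min(r1 + halo, rows), min(c1 + halo, cols)
            sub = focal_mean(a[rs:re, cs:ce])
            # trim the halo back off before writing into the mosaic
            out[r0:r1, c0:c1] = sub[r0 - rs:r1 - rs, c0 - cs:c1 - cs]
    return out
```

In practice each expanded tile would come from something like arcpy.RasterToNumPyArray, but the trimming arithmetic is the same, so the automation is less painful than it first appears.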
11-30-2011 07:48 AM | 0 | 0 | 663
POST
I've been practicing recently with numpy for certain tasks. It works nicely (using indices) for returning the row and column numbers for small and medium-sized rasters, up to a few thousand cells on a side. When I work with large rasters (10000-by-10000 or larger), I get a Python MemoryError. It seems that numpy is not an option for doing computations with data for large rasters. (I'm running ArcGIS 10 experimentally on a Windows Server 2008 machine, 64-bit, with supposedly 48 GB of RAM. Yes, I understand that by my definition a large raster could take up several GB when held in memory.) Is there a practical procedure out there for returning row and column numbers for large rasters? I've seen suggestions for using fishnet or flow accumulation, but am guessing that the steps involved would make for very slow processing.

I'm also running into precision problems when working with large numbers in Map Algebra. Integer arithmetic works nicely up to 2^31 (about 2 billion), but I have to go to floating point for larger numbers. Computations involving large numbers, such as sums of squares, begin to lose accuracy and produce unacceptable artifacts in the output. I can get around the precision problems in Map Algebra by converting my raster objects to numpy arrays and defining them as float64. However, as the rasters get larger, I run into the memory errors again. For example, if I have a 5000-by-5000 raster, I can't keep more than about three variables (as numpy arrays) in play without getting memory errors. Running calculations on data in numpy arrays is very fast, I suppose because everything is held in memory. The limits to that memory, however, render "the numpy route" unusable for working with larger rasters.

I like the new Map Algebra, but what was ESRI thinking when they decided to give us this set of "features"? Please give us back $$rowmap and $$colmap. Please let us define rasters to have higher levels of precision. Please give us a supplemental programming option that lets us handle large rasters (arrays). Currently my only option is to write programs that won't work for large rasters. Because I do a lot of work with LiDAR-based elevation data, that's not much of an option. Does anyone have workarounds for these issues? Regards, Tim.L
11-30-2011 05:57 AM | 0 | 5 | 1687
POST
The problem was caused when I ran a block of initialization commands. The offending command was:

env.mask = 'None'

At version 9.3, setting the mask to 'None' worked fine. At version 10, no error is generated, but any work done with raster objects results in empty or unusable output. At version 10, the appropriate command in a Python script for removing a mask raster seems to be:

env.mask = ''

Of course the mask can be set to an existing raster (or feature class), but apparently not to a raster object. This works:

env.mask = 'myRasPath'

This bombs:

env.mask = myRasObj

I didn't find any of this in the documentation, but discovered it through a process of elimination. ESRI probably could improve their documentation regarding valid/invalid usage of 'None' (such as rasterStatistics), commands that seem to return or print the string 'None', and older commands for which 'None' was an acceptable argument. If anyone has compiled a nice list of all the things that work differently between Python geoprocessing scripts at 10 versus 9.3, I'd love to make use of it. Regards, Tim.L
11-18-2011 10:15 AM | 0 | 0 | 209
POST
This is my first attempt at converting Python from 9.3 to 10.0, and I'm having problems using raster objects with map algebra. I can create a raster object using Raster('pathname'), but any further use of the raster object in map-algebra expressions doesn't seem to work. Here is a simplified example from my Python IDLE window. My typing is in column 1; output is indented. When the output is 'None', the expression didn't work properly. Also, when the output is 'None', if I try to save the raster object using rasobj.save('path'), I get an error message.

import arcpy
import math
arcpy.SetProduct('ArcInfo')
    u'CheckedOut'
from arcpy import env
from arcpy.sa import *
arcpy.CheckOutExtension('spatial')
    u'CheckedOut'
env.workspace = 'c:/workspace'
InRas = 'slope_deg'
InRasObj = Raster(InRas)
print InRasObj.maximum
    80.2379150391
Deg2Rad = math.pi / 180.
print Deg2Rad
    0.0174532925199
r1 = InRasObj * Deg2Rad
print r1.maximum
    None
r1 = Raster(InRas) * Deg2Rad
print r1.maximum
    1.40041577816
r2 = Tan(r1)
print r2.maximum
    None
r2 = Tan(InRasObj * Deg2Rad)
print r2.maximum
    None
r2 = Tan(Raster(InRas) * Deg2Rad)
print r2.maximum
    5.81231069565

I can only use map-algebra expressions successfully if I don't use any raster objects to the right of the equals sign. I have a lot of 9.3 programs to convert to 10, and I was looking forward to using the simplified map algebra instead of the old SOMA. It's slow going so far. Thanks in advance, Tim.L
11-11-2011 06:23 AM | 0 | 1 | 331
Online Status: Offline | Date Last Visited: 11-11-2020 02:23 AM