
Geoprocessing Bug - scripting fails when producing 1000s of viewsheds

08-14-2010 04:21 AM
jimparsons1
Emerging Contributor
I am producing 1000s of small viewsheds and have built a large Python script to do it.
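
For context, the core of the script is just a loop over observer points, roughly like the sketch below (heavily simplified, with made-up paths and names; this is the arcpy flavour, and the 9.3 arcgisscripting version looks much the same):

    # Simplified sketch of the batch viewshed loop (paths and names are invented).
    import os
    import arcpy

    arcpy.CheckOutExtension("Spatial")        # Viewshed needs a Spatial Analyst licence
    arcpy.env.overwriteOutput = True

    elev = r"C:\work\dem"                     # input elevation raster
    obs_dir = r"C:\work\observers"            # one observer point shapefile per viewshed
    out_dir = r"C:\work\viewsheds"            # every output GRID shares this folder's "info" subfolder

    for i in range(13600):
        obs = os.path.join(obs_dir, "obs_%d.shp" % i)
        out = os.path.join(out_dir, "vs%05d" % i)    # GRID names must stay short (13 chars max)
        arcpy.sa.Viewshed(elev, obs).save(out)       # each save adds arcNNNN.* files to the info folder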

After a few thousand viewsheds, PythonWin crashes with no error or warning. After a while I figured out why: in the viewshed raster output folder there's a subfolder called "info", full of files such as arc0002.dat and arc0002.nit.

Once it reaches arc9999.dat you can create no more rasters, and PythonWin just crashes without warning. A hard crash.

So I manually deleted the contents of the info folder and now it magically works again.

So some fool has decided that folder should never be bigger than 9999 items. Would someone please fix this bug?

Last time I reported a bug (http://forums.arcgis.com/threads/4524-Symbology-gt-Stretched-is-not-smooth-at-all) no one at ESRI even bothered to reply. I guess they're not interested in defects in their product.

By posting this here, perhaps others will understand why their large geoprocessing operations fail. Thanks
3 Replies
jimparsons1
Emerging Contributor
OK, I did more testing and can confirm that Viewshed won't save a raster to a folder once the "info" folder has reached arc9999.dat. That doesn't mean you're allowed 9999 rasters in that folder, far from it: you'll only get about 2500, because for some reason the .dat files accumulate faster than the actual number of rasters. Sloppy coding in the extreme.

My workaround is to create a new directory every 1000 viewsheds. I wrote a VB2008 program to generate my .py batch scripts; it's very easy to iterate in VB to create the new folder and file names, then execute the .py file in PythonWin.

My aim is to leave this running overnight, and with the new folder every 1000 rasters hopefully it won't crash. We'll see in the morning!
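
In Python terms the folder rotation boils down to something like this (again just a sketch; the folder naming and the batch size of 1000 are illustrative):

    # Sketch of the new-folder-every-1000-viewsheds workaround (names are illustrative).
    import os
    import arcpy

    arcpy.CheckOutExtension("Spatial")

    elev = r"C:\work\dem"
    obs_dir = r"C:\work\observers"
    out_root = r"C:\work\viewsheds"
    BATCH = 1000                                     # new folder (and fresh "info" subfolder) every 1000 rasters

    for i in range(13600):
        batch_dir = os.path.join(out_root, "batch_%03d" % (i // BATCH))
        if not os.path.isdir(batch_dir):
            os.makedirs(batch_dir)                   # each new folder starts its own arcNNNN counter
        obs = os.path.join(obs_dir, "obs_%d.shp" % i)
        out = os.path.join(batch_dir, "vs%05d" % i)
        arcpy.sa.Viewshed(elev, obs).save(out)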

Either way, there is a massive bug. Someone decided the .dat files in the info folder should max out at 9999, and that was a stupid decision. I'm currently creating 13600 viewsheds, and this is just a pilot. Batch tools should be able to handle enormous batches; they shouldn't crash because of some naming restriction.

It's a shame Python geoprocessing is so slow; it only mildly taxes one of my CPU cores. If ArcGIS were written to use CPUs efficiently, I could finish this job in a fraction of the time.

Still, ESRI is not exactly known for efficiency, is it?
ChrisSnyder
Honored Contributor
You are doing what I do: build new folders every so often. Another, but slower, option is to store the rasters in a file geodatabase (FGDB), which I believe has no raster storage limit but has the added overhead of reformatting the grid to FGDB format.
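
If you go the FGDB route, the extra step is roughly this (a rough sketch with made-up paths; the CopyRaster call is the reformatting overhead I mean):

    # Rough sketch: move each temporary GRID into a file geodatabase (paths are made up).
    import arcpy

    out_gdb = r"C:\work\results.gdb"
    if not arcpy.Exists(out_gdb):
        arcpy.CreateFileGDB_management(r"C:\work", "results.gdb")

    grid = r"C:\work\scratch\vs00001"                        # temporary GRID written by Viewshed
    arcpy.CopyRaster_management(grid, out_gdb + "\\vs00001") # reformat the GRID into the FGDB
    arcpy.Delete_management(grid)                            # drop the GRID and its info-folder entries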

I have to agree: ESRI has made it very hard to process rasters efficiently! Some of the issue is that many of the underlying algorithms (file I/O and whatnot) were written 20+ years ago and are not really optimized for modern computers. My main gripe, however, is that you can't run two or more ArcGIS Spatial Analyst tools concurrently on the same machine. I have written Python code that basically does vector and tabular parallel processing (it tiles a dataset into pieces and runs each piece concurrently), but for some reason you can't do that with the SA tools! It's sad/funny, since you CAN do this in workstation ArcInfo! I have reported/complained to ESRI about this for quite a while and never gotten a good answer. Last I heard it was with their user advocacy group...
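
For the vector/tabular side, the "parallel" part is nothing fancier than launching several python.exe processes, one per tile, along these lines (a bare-bones sketch; process_tile.py stands in for a hypothetical worker script that geoprocesses the tile number passed on the command line):

    # Bare-bones sketch of tiled parallel processing via separate Python processes.
    # "process_tile.py" is a hypothetical worker script that handles one tile.
    import subprocess
    import sys

    NUM_TILES = 8
    workers = [subprocess.Popen([sys.executable, "process_tile.py", str(t)])
               for t in range(NUM_TILES)]     # each worker gets its own geoprocessor instance

    for p in workers:
        p.wait()                              # block until every tile has finished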
jimparsons1
Emerging Contributor
Thanks, Chris. Yes, I read in another bug report about the number of rasters in a folder that ESRI was unwilling to tinker with the vintage code in case it broke anything else.

It's quite staggering just how antiquated so much of this application is. I once saw a presentation by Paul Hardy of ESRI UK and he seemed to be bragging that they had doubled their source code base before the 9.1 release.

If only there were decent competition in this field, we'd get GIS software that is reliable, bug-free and, most importantly, fast and efficient, using the multicore CPUs that have been around for years and years...