I have multiple years of lidar coverage of a riverine system, consisting of about 1.5 billion points in about 100 (quarter-quad) LAS tiles for each year. I also have stereophotogrammetrically derived water polygons, made from lidar intensity image derivatives, for each year. Some of the water polygons are for the river itself and span many of the LAS tiles. I want to reclassify the points falling within the water polygons to class 9 (the ASPRS water class) in all LAS tiles.
I did this by running the "Set LAS Class Code Using Features" geoprocessing tool on a LAS dataset created for the entire directory for a particular year.
I computed statistics for the LAS dataset before running the tool on it, and checked the "update statistics" box.
I'm running Windows 7 64-bit on dual quad-core Xeons with 12 GB RAM, using ArcGIS 10.1 SP1 with 64-bit background geoprocessing (also SP1).
I started the tool running at noon yesterday, and by checking file modification dates in Windows Explorer I can tell that only about half of the files had been processed 10 hours later.
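(For what it's worth, that modification-date check can be scripted instead of eyeballed in Explorer. A minimal sketch; the directory path and start time in the commented example are placeholders:)

```python
import os
import glob
from datetime import datetime

def count_processed(las_dir, started_at):
    """Count LAS tiles whose modification time is at or after the tool's start time."""
    tiles = glob.glob(os.path.join(las_dir, "*.las"))
    done = [t for t in tiles
            if datetime.fromtimestamp(os.path.getmtime(t)) >= started_at]
    return len(done), len(tiles)

# Example (placeholder path and date):
# done, total = count_processed(r"D:\lidar\2012", datetime(2013, 1, 15, 12, 0))
# print("%d of %d tiles processed" % (done, total))
```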
So what's my best strategy for breaking up the work so it runs more efficiently? I could cut the polygons on the tile seamlines, or I could create a series of LAS datasets for subsets of the LAS tiles and run the tool on each of those. I haven't tested the tool on a LAS dataset containing only one or two tiles yet.
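(If the per-subset route wins out, building the batches can be scripted so each chunk of tiles gets its own LAS dataset to feed the tool. A minimal sketch of the chunking side; the arcpy tool names in the comments are the real geoprocessing tools, but the exact parameter lists should be checked against the 10.1 Help, and the paths are placeholders:)

```python
import os
import glob

def chunk_tiles(las_dir, chunk_size):
    """Split the LAS tile list into fixed-size batches, e.g. 100 tiles -> 10 batches of 10."""
    tiles = sorted(glob.glob(os.path.join(las_dir, "*.las")))
    return [tiles[i:i + chunk_size] for i in range(0, len(tiles), chunk_size)]

# For each batch, one would then (from an ArcGIS Python window or script):
#   arcpy.CreateLasDataset_management(batch, out_lasd)        # one LASD per subset
#   arcpy.SetLasClassCodesUsingFeatures_3d(out_lasd, ...)     # reclassify to class 9
# (parameter lists above are abbreviated -- verify against the tool documentation)
```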
Why does it run so slowly? Windows Resource Monitor shows the CPUs at about 6%, 6 GB of memory free, and minimal disk activity. I do notice frequent disk access to pagefile.sys, which seems odd given the free RAM available.