Simplify Line Tool throws Out of Memory errors at under 25% usage

03-07-2019 11:09 AM
BillSchlatter
Occasional Contributor

Short version: The Simplify Line tool is failing for me, throwing Out of Memory errors.  I'm running this on a system with 64 GB of RAM, and the Task Manager reports never getting above about 14 GB used across all processes before the tool fails.  Does anyone know why this might be happening, or how I could prevent it from happening again?

Long-winded explanation:

I have a series of 1-foot contour line datasets for several counties in my region.  They are pretty heavy datasets, ranging from 1.7 GB to 4.8 GB apiece.  I'm currently trying to use the Simplify Line tool to reduce their complexity, improving map performance and ease of distribution, but I'm running into issues with the tool.  In particular, I keep getting Error 000426: Out of Memory.

I have watched the system resource usage in Windows Task Manager, and it's not even close to running out.  I'm running it on a remote system with 64 GB of RAM, and the highest I've seen the total system usage get before the tool fails is 14 GB, with 6-8 GB being far more common.  I've double-checked with our IT department, and there are no limits on how much RAM a single user can use on that system, and I have this problem even when I'm the only user logged in.

I am using cartographic partitions generated using the Create Cartographic Partitions tool to try to reduce the amount of data in memory to a more reasonable size.  I've tried generating minimum-size partitions (500 features per partition), and the tool still eventually fails, after succeeding at a number of low-density partitions.  The same thing happens if I filter out 80% of the lines to show only elevation multiples of 5 and run it with those tiny partitions. 

Through trial and error, I've found that I can get the tool to complete successfully on the smallest, lowest-density county if I filter the features to show only elevation multiples of 25, hiding 96% of the lines.  That is why I'm trying to figure out how to make these settings work: those results look good, while the output of the Generalize tool (which does complete successfully) looks jagged.  Also, I don't want to have to merge together 25 result datasets per county, and that's a conservative estimate, since the larger, hillier counties would likely need to be filtered even further.
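For anyone wondering why point-removal output looks jagged compared to the bend-preserving algorithm: "retain critical points" simplification (Douglas-Peucker) keeps only the vertices that deviate most from the overall trend line, so smooth curves collapse into straight chords. A minimal sketch of that idea in plain Python (for illustration only; this is not Esri's implementation, and the function names are made up):

```python
import math

def _perp_dist(pt, a, b):
    # Perpendicular distance from pt to the line through a and b.
    (x, y), (x1, y1), (x2, y2) = pt, a, b
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    if length == 0:
        return math.hypot(x - x1, y - y1)
    return abs(dy * x - dx * y + x2 * y1 - y2 * x1) / length

def point_remove(points, tolerance):
    """Douglas-Peucker: keep only vertices that deviate more than
    `tolerance` from the chord between the endpoints."""
    if len(points) < 3:
        return list(points)
    # Find the vertex farthest from the endpoint-to-endpoint chord.
    dists = [_perp_dist(p, points[0], points[-1]) for p in points[1:-1]]
    idx = max(range(len(dists)), key=dists.__getitem__) + 1
    if dists[idx - 1] > tolerance:
        # Keep the critical vertex and recurse on both halves.
        left = point_remove(points[:idx + 1], tolerance)
        right = point_remove(points[idx:], tolerance)
        return left[:-1] + right
    # No vertex deviates enough: collapse the run to its endpoints.
    return [points[0], points[-1]]
```

Every kept vertex becomes a sharp corner in the output, which is exactly the jaggedness you're seeing; the Wang-Müller bend algorithm instead analyzes whole bends, which is why it looks so much better on contours.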

I am using the following settings:

Simplification Algorithm: Retain Critical Bends (Wang-Müller)

Simplification Tolerance: 10 Feet

Everything else (besides Cartographic Partitions) is default. 

Edit to add: I'm currently using ArcGIS Pro 2.3.0. 

MichaelVolz
Esteemed Contributor

Do you have the ability to run this process in a version of ArcMap (let's go old school) to see if you get better results (get some work done)?

BillSchlatter
Occasional Contributor

I hadn't tried that, just because I don't like ArcMap very much, so it didn't even occur to me as a possible alternative.

I have now tried it in ArcMap with the same settings as one of my fail test cases in Pro.  ArcMap fails, too, and on an earlier cartographic partition than Pro does.  ArcMap's memory consumption appears to be even lower at the time of failure, which isn't a surprise given the limitations of 32-bit architecture.  64-bit software like Pro should, in theory, be able to use far more memory than the 64 GB on my system, but I'm starting to wonder if there is a limit on how much Pro can use, whether by design or by accident. 

ThomasColson
MVP Frequent Contributor

"Not a good day for contours.." Hilarious. 

Are you running this out of a FGDB? If so, is it on a network share or local? If local, what type of hard drive? SSD? Spinning Platter Antique?

BillSchlatter
Occasional Contributor

I'm running it out of a file geodatabase, and it's on a network drive. 

VinceAngelo
Esri Esteemed Contributor

Network drives are bad juju for file geodatabases.  Always use a local drive for I/O-intensive operations.

- V

CraigWilliams
Esri Contributor

In many contour datasets, the features are very large, with a single contour value being one long winding feature with a large extent. This may explain why you're hitting memory limits even with 500 features per partition. It's also the reason the Contour tool now has a Maximum vertices per feature parameter. For existing contours, you may want to run the Dice tool to break these larger geometries into smaller ones.
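As a rough illustration of what dicing does for a polyline (a hypothetical sketch, not the actual Dice tool): the long vertex array gets cut into overlapping pieces of at most N vertices, so no single geometry has to be processed whole:

```python
def dice_line(vertices, max_vertices):
    """Split one long polyline into pieces of at most `max_vertices`
    vertices each, repeating the shared vertex at every cut so the
    pieces still connect end to end."""
    if max_vertices < 2:
        raise ValueError("each piece needs at least 2 vertices")
    pieces = []
    start = 0
    while start < len(vertices) - 1:
        end = min(start + max_vertices, len(vertices))
        pieces.append(vertices[start:end])
        start = end - 1  # overlap by one vertex to keep continuity
    return pieces
```

A single county-spanning contour with hundreds of thousands of vertices becomes many small features, each of which fits comfortably inside a cartographic partition.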

If generating contours from scratch, I'd highly recommend preprocessing the input raster with a Focal Statistics operation with the Mean option or the Filter tool with the Low option. This will create smoother contours from the start and will produce better results than generating contours from a noisy DEM and attempting to smooth or simplify them after the fact.
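The Focal Statistics Mean idea is just a moving-average window passed over the elevation grid before contouring. A toy sketch in plain Python (hypothetical function name; the real tool operates on rasters and handles NoData, which this skips):

```python
def focal_mean(grid, radius=1):
    """Moving-average (focal mean) filter over a 2D elevation grid,
    using a (2*radius+1)-square window clamped at the edges."""
    rows, cols = len(grid), len(grid[0])
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total = count = 0
            for dr in range(-radius, radius + 1):
                for dc in range(-radius, radius + 1):
                    rr, cc = r + dr, c + dc
                    if 0 <= rr < rows and 0 <= cc < cols:
                        total += grid[rr][cc]
                        count += 1
            out[r][c] = total / count
    return out
```

Averaging out single-cell spikes in the DEM removes the tiny wiggles that would otherwise become thousands of extra contour vertices, which is why smoothing first beats simplifying after.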

RyanKelso
Occasional Contributor III

I have encountered this same problem with the out of memory error.  I'm also building contour lines and trying to run Simplify Line to shrink the files and get rid of unnecessary vertices.

The thing is, I went through this exact same process six years ago, and the tools ran without a hitch.  That would have been ArcGIS Desktop 10.0 or 10.1 with hardware that is more than six years older than the hardware I'm running on today.  No cartographic partitions, no dicing up lines or setting a max number of vertices per feature, it just worked.  I've tried this with ArcGIS Pro 2.4.1 and Desktop 10.6.1.  It's not just Simplify Line that is getting the error, the Contour tool is also getting the error in some cases.  I've not gotten the error with any other tools.

My process already incorporates smoothing the DEMs before building contours by using resampling and filtering functions.  Actually this time around I am using ArcGIS Pro's Contour function with its Adaptive Smoothing, which I definitely recommend over using a standard filtering operation.

This led me to trying it on some older installations of ArcGIS.  On one machine running Desktop 10.5.1, I ran into the same out of memory errors.  On another machine I tried 10.2.2 and 10.4.1 and was able to successfully complete the processes that were failing on the other machines running newer versions of ArcGIS.  None of the machines were ever actually close to running out of available memory.

MikeBrouillette
New Contributor

Same problem with SIMPLIFY out of memory errors, so I followed RyanKelso's lead and installed 10.4.1 on a spare workstation, and I can confirm that the SIMPLIFY command works successfully in Desktop 10.4.1 where both 10.7.1 and Pro 2.4.1 fail.  With the later versions I got the same "Out of memory" error when the feature count hit 38,100, whereas 10.4.1 cranked through all 464,000 features without ever using more than 30% of memory. Disappointing (and time-consuming) that stable, productive tools developed in earlier versions can't at the very least retain basic functionality in subsequent releases.

Quick follow-up: initially I was testing a small enough area in 10.7.1/Pro 2.4.1 that I didn't hit the memory wall, and I had found the "Weighted Area" algorithm to be the best option for reducing vertices/file storage (85%!) while retaining decent feature geometry. Unfortunately, version 10.4.1 only has the POINT_REMOVE and BEND_SIMPLIFY algorithms, so I uninstalled 10.4.1, installed 10.5.1 (WEIGHTED_AREA was introduced at this version), ran the same model, and it threw an error at only 27,000 features (out of the 464,000)! So it looks like the issue came in after version 10.4.1, which puts the WEIGHTED_AREA algorithm out of reach.
