Simplify Line Tool throws Out of Memory errors at under 25% usage

03-07-2019 11:09 AM
BillSchlatter
Occasional Contributor

Short version: The Simplify Line tool is failing for me, throwing Out of Memory errors.  I'm running this on a system with 64 GB of RAM, and the Task Manager reports never getting above about 14 GB used across all processes before the tool fails.  Does anyone know why this might be happening, or how I could prevent it from happening again?

Long-winded explanation:

I have a series of 1-foot contour line datasets for several counties in my region.  They are pretty heavy datasets, ranging from 1.7 GB to 4.8 GB apiece.  I'm currently trying to use the Simplify Line tool to reduce their complexity, improving map performance and ease of distribution, but I'm running into issues with the tool.  In particular, I keep getting error 000426: Out of Memory (see the "000426: Out Of Memory" topic in the ArcGIS Desktop help).

I have watched the system resource usage in Windows Task Manager, and it's not even close to running out.  I'm running it on a remote system with 64 GB of RAM, and the highest I've seen total system usage get before the tool fails is 14 GB, with 6-8 GB being far more common.  I've double-checked with our IT department, and there's no limit on how much RAM a single user can use on that system, and I have this problem even when I'm the only user logged in.

I am using cartographic partitions generated using the Create Cartographic Partitions tool to try to reduce the amount of data in memory to a more reasonable size.  I've tried generating minimum-size partitions (500 features per partition), and the tool still eventually fails, after succeeding at a number of low-density partitions.  The same thing happens if I filter out 80% of the lines to show only elevation multiples of 5 and run it with those tiny partitions. 

Through trial and error, I've found that I can get the tool to complete successfully on the smallest, lowest-density county if I filter the features to show only elevation multiples of 25, hiding 96% of the lines. That is why I'm trying to figure out how to make these settings work - those results look good, while the output of the Generalize tool (which does complete successfully) looks jagged.  Also, I don't want to have to merge together 25 result datasets per county, and that's a conservative estimate, because it's likely the larger, hillier counties would need to be filtered even further.  
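For anyone checking the percentages above, the arithmetic is simple: with 1-foot contour intervals, filtering to elevations that are multiples of N keeps 1 line in N.  A quick sketch (plain Python, no ArcGIS needed):

```python
# Back-of-the-envelope check: with 1-foot contour intervals, keeping only
# elevations that are multiples of N retains 1 line in N, hiding the rest.
def retained_fraction(contour_interval_ft, keep_multiple_ft):
    """Fraction of contour lines kept when filtering to multiples of keep_multiple_ft."""
    return contour_interval_ft / keep_multiple_ft

hidden_at_5 = 1 - retained_fraction(1, 5)    # multiples of 5 hide 80% of 1-ft contours
hidden_at_25 = 1 - retained_fraction(1, 25)  # multiples of 25 hide 96%
```

That matches the 80% and 96% figures I quoted, and it makes clear how aggressive the multiples-of-25 filter really is.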

I am using the following settings:

Simplification Algorithm: Retain Critical Bends (Wang-Muller)

Simplification Tolerance: 10 Feet

Everything else (besides Cartographic Partitions) is default. 
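In case it helps anyone reproduce this, here's roughly what the same run looks like driven from Python.  This is just a sketch of the standard arcpy calls, not my exact script; the paths are hypothetical, and the deferred import is only so the snippet can be read without an ArcGIS install:

```python
# Sketch of the equivalent arcpy run (feature class paths are hypothetical).
def simplify_contours(in_fc, out_fc, partitions_fc):
    import arcpy  # deferred so the sketch can be inspected without ArcGIS

    # Point the geoprocessing environment at the partitions so the tool
    # processes one partition's worth of features at a time.
    arcpy.env.cartographicPartitions = partitions_fc

    # BEND_SIMPLIFY is the keyword for Retain Critical Bends (Wang-Muller).
    arcpy.cartography.SimplifyLine(
        in_fc,
        out_fc,
        algorithm="BEND_SIMPLIFY",
        tolerance="10 Feet",
    )
```

Same settings as listed above; everything else left at its default.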

Edit to add: I'm currently using ArcGIS Pro 2.3.0. 

10 Replies
RyanKelso
Occasional Contributor III

That's too bad about the weighted area algorithm, Mike.  Frustrating.  I came to the same conclusion that this was broken after the 10.4.1 release.  I found a bug report, BUG-000123000, indicating the issue has been fixed for version 10.8.  Hopefully they fixed whatever the underlying problem is, because I also encountered it with the Contour tool and suspect it's not limited to just these two tools.

It's not difficult to resent Esri for their cycle of breaking existing features, letting them fester for a few years, forcing users to upgrade to the latest release if we want fixes instead of releasing patches for older versions, and leaving us to find the next round of broken features... the cycle repeats.
