RuntimeLocalServer memory consumption on Intersect

12-12-2016 07:07 AM
BledarBirbo1
Occasional Contributor

Hi.

We are intersecting large amounts of features from a file geodatabase in a Python script, using 64-bit background processing. The machine has 16 GB of RAM, with 12 GB available when the script starts.
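
A simplified sketch of what the script does (paths and layer names below are placeholders, not our real data):

```python
# Simplified sketch of the failing workflow; paths and layer names are placeholders.
import arcpy

arcpy.env.workspace = r"C:\data\work.gdb"  # the file geodatabase
arcpy.env.overwriteOutput = True

# Run with the 64-bit python.exe installed by Background Geoprocessing (64-bit)
arcpy.Intersect_analysis(["big_layer_a", "big_layer_b"], "intersect_out")
```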

The issue is that the Intersect tool sometimes fails to run, especially when it has previously run once.

We have observed that the RuntimeLocalServer.exe instance consumes too much memory, reaching 12 GB, and then the tool crashes.

Each time the script crashes, the RuntimeLocalServer.exe memory consumption is about 3 GB (it seems to be leaking memory).

My questions are:

1) What is the maximum amount of memory that RuntimeLocalServer.exe is allowed to use?

2) Is the Intersect tool expected to be so memory hungry?

3) What are the best practices for managing memory when multiple tools run in the same Python script? How should we clear out the memory from the previous tool?

Thanks in advance.

1 Reply
KevinHibma
Esri Regular Contributor

There is no maximum amount of memory the RuntimeLocalServer process can use. It depends on two things: whether the process is 32-bit or 64-bit, and the tool being run. The RuntimeLocalServer process is the same process if you're using regular 32-bit background processing; in that case it's bound by the operating system's limit of about 3 GB. If you're using 64-bit background processing, it could possibly use as much RAM as you have on the system. Of course, this is all tool dependent: some tools, no matter how big the dataset, may only use a few hundred MB, while other tools will try to load as much into RAM as possible.

It's hard to talk in absolutes, but generally the Intersect tool will make use of available RAM, so you could see high utilization.
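
If you want to see this for yourself, you could poll the process's working set while the tool runs. A quick sketch, assuming the third-party psutil package is installed (it does not ship with ArcGIS):

```python
# Sketch: report total memory held by RuntimeLocalServer.exe instances.
# Requires psutil, a third-party package that does not ship with ArcGIS.
import psutil

def local_server_memory_mb():
    total_rss = 0
    for proc in psutil.process_iter():
        try:
            if proc.name() == "RuntimeLocalServer.exe":
                total_rss += proc.memory_info().rss  # resident set size in bytes
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass  # the process may exit, or be inaccessible, mid-scan
    return total_rss / (1024.0 * 1024.0)

print("RuntimeLocalServer memory: %.0f MB" % local_server_memory_mb())
```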

If there actually is a memory leak, the only thing you can do is bring down the .exe and start it back up again. Understand, though, that there may not actually be a memory leak here; it could just be that the tool is trying to use as much memory as possible to get the job done. A leak would be more likely if you run the tool start to finish and, after it finishes, the process is still holding gigabytes of memory; holding a few hundred MB would be normal. But these are generalized statements; each case (computer, OS, version, data, etc.) will have a different "correct" answer.
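
One common way to force the memory back between tools is to run each heavy tool in its own Python process, so everything is released to the OS when that process exits. A rough sketch only (names and paths are placeholders):

```python
# Sketch: run each heavy geoprocessing step in a fresh child process so
# all memory is returned to the OS when the child exits. Placeholder names.
import multiprocessing

def run_intersect(gdb, inputs, output):
    # Import arcpy inside the worker so each child gets a fresh session.
    import arcpy
    arcpy.env.workspace = gdb
    arcpy.env.overwriteOutput = True
    arcpy.Intersect_analysis(inputs, output)

if __name__ == "__main__":
    jobs = [
        (r"C:\data\work.gdb", ["layer_a", "layer_b"], "out_ab"),
        (r"C:\data\work.gdb", ["layer_c", "layer_d"], "out_cd"),
    ]
    for gdb, inputs, output in jobs:
        p = multiprocessing.Process(target=run_intersect, args=(gdb, inputs, output))
        p.start()
        p.join()  # wait, so only one child (and its memory) is alive at a time
```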

I'd actually give this blog a read, I think it's very relevant to what you're doing: Dicing Godzillas (features with too many vertices) | ArcGIS Blog

Be successful overlaying large, complex datasets in Geoprocessing | ArcGIS Blog 

Tiled processing of large datasets—Help | ArcGIS for Desktop 
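
For completeness, the "dicing" idea from the first blog looks roughly like this in arcpy: split any feature with an extreme vertex count before running the overlay. A sketch only; the vertex limit is a placeholder you would tune for your data:

```python
# Sketch of the dicing workflow: subdivide very dense features before Intersect.
# The 10000-vertex limit is a placeholder to tune; names are placeholders too.
import arcpy

arcpy.env.workspace = r"C:\data\work.gdb"
arcpy.env.overwriteOutput = True

# Dice splits any feature whose vertex count exceeds the limit into smaller
# features, which keeps the overlay's memory footprint down.
arcpy.Dice_management("dense_layer", "dense_layer_diced", 10000)
arcpy.Intersect_analysis(["dense_layer_diced", "other_layer"], "intersect_out")
```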
