Memory usage for ArcMap - System.OutOfMemoryException //c#

03-19-2013 06:53 AM
MarcinDruzgala
Occasional Contributor
Hi, the problem is common as far as I know from reading posts on this and other forums (memory usage for ArcGIS).
I'm developing an add-in for ArcMap that checks electricity flow in a Geometric Network. I've created a loop over more than 14,000 start points for my analysis. I'm using the Utility Network Analyst and the FindFlowElements method; the algorithm is pretty simple:
1. Get the Geometric Network IDs for the starting point (FCID, FID, subID)
2. Configure the Geometric Network
3. FindFlowElements -> as a result I get junctionEIDs and edgeEIDs
4. Create a selection from the result
5. Analyze the selection
6. Log info
7. Clear the result
8. Clear the flags for the Utility Network Analyst
9. Clear the selection for the map
10. Partial refresh
11. Marshal.FinalReleaseComObject(traceFlowSolver); // added while testing, because of the out-of-memory exception
...and again from point 1 for the next point.
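The steps above might be sketched roughly like this (a sketch only: `geometricNetwork` and the flag/analysis bodies are assumptions I'm filling in, and the `FindFlowElements` call follows the ArcObjects `ITraceFlowSolverGEN` interface as documented for 10.x):

```csharp
// One iteration of the trace loop with eager COM cleanup.
ITraceFlowSolverGEN solver = null;
try
{
    solver = new TraceFlowSolverClass() as ITraceFlowSolverGEN;
    INetSolver netSolver = (INetSolver)solver;
    netSolver.SourceNetwork = geometricNetwork.Network; // assumed field

    // ... set the origin flag for the current start point here ...

    IEnumNetEID junctionEIDs, edgeEIDs;
    solver.FindFlowElements(esriFlowMethod.esriFMConnected,
        esriFlowElements.esriFEJunctionsAndEdges,
        out junctionEIDs, out edgeEIDs);

    // ... create the selection, analyze, log ...

    // The enum results are COM objects too; each one left unreleased
    // keeps a runtime-callable wrapper (and unmanaged memory) alive.
    Marshal.ReleaseComObject(junctionEIDs);
    Marshal.ReleaseComObject(edgeEIDs);
}
finally
{
    if (solver != null)
        Marshal.FinalReleaseComObject(solver);
}
```

The point of the sketch is that every out-parameter and intermediate COM object needs its own release, not just the solver at the end.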

So when we tested this, we encountered a lot of errors. I have log info that we write to a txt file; it will be easier to illustrate the problem with an example:
INFO 2013-03-19 12:56:12,302 - Called FindFlowElements method 1374 time(s)
Number of objects in List<T> SN = 1316, memory usage = 0,06808 MB
Number of objects in List<T> nN = 10068, memory usage = 0,43913 MB
Station number 7545
Process name: ArcMap -> memory use 834,5586 MB || GC TotalMemory -> 80,0527 MB
Processes list:
IEXPLORE ||memory use-> 31,6836 MB
FSM32 ||memory use-> 15,6992 MB
SVCHOST ||memory use-> 26,4336 MB
EXPLORER ||memory use-> 85,4766 MB
SMSVCHOST ||memory use-> 37,5313 MB
DSTERMSERV ||memory use-> 27,0664 MB
FSSM32 ||memory use-> 167,2422 MB
SVCHOST ||memory use-> 48,6758 MB
SVCHOST ||memory use-> 82,5117 MB
FSHDLL32 ||memory use-> 52,4531 MB
BCU ||memory use-> 85,0273 MB
ORACLE ||memory use-> 771,1250 MB
SPLWOW64 ||memory use-> 15,6484 MB
IEXPLORE ||memory use-> 15,7031 MB
ARCMAP ||memory use-> 964,2070 MB
SEARCHINDEXER ||memory use-> 50,6445 MB
TNSLSNR ||memory use-> 36,6602 MB
WINWORD ||memory use-> 22,3398 MB
FSGK32 ||memory use-> 15,4453 MB
SOFFICE.BIN ||memory use-> 16,2188 MB
AUDIODG ||memory use-> 18,5859 MB
IEXPLORE ||memory use-> 19,3359 MB
SPOOLSV ||memory use-> 17,4063 MB
WMPNETWK ||memory use-> 30,1172 MB
SVCHOST ||memory use-> 29,7500 MB
SVCHOST ||memory use-> 234,8750 MB
DWM ||memory use-> 45,6406 MB
SVCHOST ||memory use-> 25,8594 MB
CONNECT.SERVICE.CONTENTSERVICE ||memory use-> 51,8594 MB
Memory usage sum = 2911,0120 MB
=================================================================================================

INFO 2013-03-19 12:56:14,954 - Called FindFlowElements method 1375 time(s)
Number of objects in List<T> SN = 1316, memory usage = 0,06808 MB
Number of objects in List<T> nN = 10072, memory usage = 0,43931 MB
Station number 7397
Process name: ArcMap -> memory use 838,7539 MB || GC TotalMemory -> 84,8050 MB
Processes list:
x
x
x
x
Memory usage sum = 2899,1410 MB

The FindFlowElements method was called 1375 times (14,000 expected...) and ArcMap already uses ~840 MB of memory (when it reaches ~2 GB, ArcMap throws an error), and it grows with every iteration. Sometimes it drops by a few MB, but not too often, so I will probably have to do the cleaning manually. The question is: how?
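One common mitigation for this symptom (hedged: it only helps if the growth comes from dead runtime-callable wrappers awaiting finalization, which is a guess here) is to force a full collection plus finalization every so many iterations; `iteration` is an assumed loop counter:

```csharp
// Periodically flush dead RCWs: the finalizers release the underlying
// COM objects, and the second collect reclaims the finalized wrappers.
if (iteration % 100 == 0)   // every 100 start points, for example
{
    GC.Collect();
    GC.WaitForPendingFinalizers();
    GC.Collect();
}
```

This is a band-aid rather than a fix; if an object is still rooted (e.g. by a static field), no amount of collecting will reclaim it.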

Now I will explain how I calculated/obtained the memory usage:
1.For ArcMap:
private const float mbyte = 1048576;
Process proc = Process.GetCurrentProcess();
string privMemSize = (proc.PrivateMemorySize64 / mbyte).ToString("0.0000");


2.For Garbage Collector:
string gcTotalMem = (GC.GetTotalMemory(false) / mbyte).ToString("0.0000");
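The two measurements above can be combined into one small helper (plain .NET, nothing ArcObjects-specific; `mbyte` is the constant from point 1, and the namespaces needed are `System` and `System.Diagnostics`). Note that the 64-bit property matters: the older int-valued `Process.PrivateMemorySize` wraps negative once the process passes 2 GB, whereas `PrivateMemorySize64` does not.

```csharp
// Returns one log line with both the process's private bytes and the
// GC-managed heap size, formatted in MB.
private static string GetMemorySnapshot()
{
    using (Process proc = Process.GetCurrentProcess())
    {
        string privMemSize = (proc.PrivateMemorySize64 / mbyte).ToString("0.0000");
        string gcTotalMem = (GC.GetTotalMemory(false) / mbyte).ToString("0.0000");
        return "memory usage " + privMemSize + " MB || GC TotalMemory -> " + gcTotalMem + " MB";
    }
}
```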


3. Memory usage of the List<T> objects I'm calculating with a method I took from Stack Exchange:
private static string GetObjectSize(object testObject)
{
 try
 {
  // Note: this measures the BinaryFormatter-serialized size, which only
  // approximates the in-memory footprint of the object graph.
  BinaryFormatter bf = new BinaryFormatter();
  using (MemoryStream ms = new MemoryStream())
  {
   bf.Serialize(ms, testObject);
   return (ms.Length / mbyte).ToString("0.00000");
  }
 }
 catch (Exception ex)
 {
  log.Error("========================================================================\n"
   + ex.Message + "\n" + ex.StackTrace + "\n"
   + "========================================================================\n");
  return "error";
 }
}


The question is: how can I handle the increasing memory usage of ArcMap.exe? Anyone got an idea?
11 Replies
RichardWatson
Frequent Contributor
ArcMap is a 32 bit process.  By default it can access 2 ** 32 / 2 bytes of memory.  If you run it on a 64 bit machine with a 64 bit OS then it can access 2 ** 32 bytes of memory.  In essence, you get 4 GB instead of 2 GB.

I didn't see where you specified which version of ArcMap you are using. In earlier versions you literally have to set the large-address-aware bit on ArcMap.exe to make this work.
AlexanderGray
Occasional Contributor III
What Richard is saying about 32 bit and large address aware is very true. The obvious question then is why your application takes so much memory and how you can reduce it. Without seeing the code of the loop it is hard to point to one thing or another. A Marshal COM release must be done on any cursors, and usually on the features or rows in the loop too. I don't know if you use recycling on the cursor or if you limit the fields. You also don't mention the type of database (large selections on an enterprise GDB are done on the database). If you can reduce the amount the memory grows per iteration, you should be able to process the entire dataset.
JasonPike
Occasional Contributor
Code examples would help us to identify the problem. Also, you can use tools like CLR Profiler, UMDH, and LeakDiag to quickly identify which objects are consuming the most memory. Once you know which ones are consuming the most memory, you can focus your attention on getting them released properly. Or, you can do what I do, and make sure everything is released immediately after it is no longer needed--this approach is error prone, so be very careful if you try it.
MarcinDruzgala
Occasional Contributor
Thanks for replying; my mistake that I didn't post the machine specifications. We are testing this utility on two PCs:
1. Windows Server 2008 R2 Foundation 64-bit
Intel Xeon X3430 @ 2.40 GHz
8 GB RAM
2. Windows 7 Professional 64-bit
Intel i7-2600K @ 3.40 GHz
8 GB RAM
On both PCs the ArcGIS version is 10.0.

ArcMap is a 32 bit process. By default it can access 2 ** 32 / 2 bytes of memory. If you run it on a 64 bit machine with a 64 bit OS then it can access 2 ** 32 bytes of memory. In essence, you get 4 GB instead of 2 GB.

Yep, I've read that ArcMap is a native 32-bit process and that it can use 4 GB of RAM, BUT I still don't think this process needs so much memory...

As for releasing COM objects, we do it every time we use one, with the ReleaseComObject method. Since we wanted to speed the application up and shorten the time needed to analyze so much data, at the beginning we create dictionaries from the feature attribute tables (of course not all attributes, just the ones we need). The code sample you are asking for I'm attaching in a txt file.

Today I did another hard test with this utility, and something really weird is happening with ArcMap. I will show you a few INFO entries from the log file (no one else was using either of the PCs in our office):

INFO 2013-03-19 08:18:57,083 - Called FindFlowElements method 3213 time(s)
Process name: ArcMap -> memory usage 1588,7460 MB || GC TotalMemory -> 264,3199 MB

INFO 2013-03-19 08:19:00,717 - Called FindFlowElements method 3214 time(s)
Process name: ArcMap -> memory usage 1492,2500 MB || GC TotalMemory -> 176,0005 MB

INFO 2013-03-19 08:19:04,430 - Called FindFlowElements method 3215 time(s)
Process name: ArcMap -> memory usage 1611,6560 MB || GC TotalMemory -> 297,3569 MB


Only 7 seconds and so much difference in memory usage... that's weird, isn't it?

Later it gets better ;) ->
INFO 2013-03-19 10:18:29,839 - Called FindFlowElements method 4823 time(s)
Process name: ArcMap -> memory usage 2010,1130 MB || GC TotalMemory -> 221,0689 MB

INFO 2013-03-19 10:18:40,649 - Called FindFlowElements method 4824 time(s)
Process name: ArcMap -> memory usage -1884,5430 MB || GC TotalMemory -> 423,0786 MB

INFO 2013-03-19 10:18:44,892 - Called FindFlowElements method 4825 time(s)
Process name: ArcMap -> memory usage -1682,0350 MB || GC TotalMemory -> 620,8250 MB

INFO 2013-03-19 10:18:49,229 - Called FindFlowElements method 4826 time(s)
Process name: ArcMap -> memory usage 1976,2620 MB || GC TotalMemory -> 183,4406 MB


Funny (or maybe it's normal??). At the same time, notice the increase in Garbage Collector managed memory, and 5 seconds later it's down by over 400 MB. I'm no expert in memory management, but it looks weird.

If you have any idea about this, please share your knowledge; I'm very curious about this.
JasonPike
Occasional Contributor

I'll address the methods one at a time as needed. I really don't understand why you are interested in the memory usage totals. You should be concerned with what objects are using the most memory. Anyway, here is the first method:

private static void GetSelectedFeatures(IFeatureLayer featLayer, string layerName, Dictionary<string, string> dc)
{
 ICursor cursor = null;
 IRow row = null;
 try
 {
  cursor = Utilities.GetCursor(featLayer);
  row = cursor.NextRow();
  while (row != null)
  {
   string nadrzedny = Utilities.GetStringValueFromField(row, FIELD_NAME_NADRZEDNY);
   string uniqueNR = Utilities.GetStringValueFromField(row, FIELD_NAME_UNIKALNY);
   if (!dc.ContainsKey(nadrzedny))
    dc.Add(nadrzedny, uniqueNR);
   
   // release each instance of row  
   Marshal.ReleaseComObject(row);
   // set it to null in case the next line throws an exception
   // don't want to release it a second time in the finally-block
   row = null; 
   row = cursor.NextRow();
  }
 }
 finally
 {
  // row should be null unless an exception occurred, in which case
  // the last allocated instance of row should still be released
  if( row != null )
   Marshal.ReleaseComObject(row);
   
  Marshal.ReleaseComObject(cursor);
 }
 
 GetFinalListOfFeatures(layerName);
}
JasonPike
Occasional Contributor
Can you post the entire class that contains these methods? It looks like there are lots of class fields (globals) that there isn't any context for in the sample you provided. You should definitely avoid class fields, particularly static class fields, because they may cause objects to stick around longer than necessary. Any object stored in a static field of a static class is going to stick around until the AppDomain shuts down, which happens when the process is torn down.
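A tiny illustration of that point (plain .NET; `Cache` and `Results` are made-up names for this example):

```csharp
using System;
using System.Collections.Generic;

static class Cache
{
    // Anything added here is reachable from a GC root and will never be
    // collected until the AppDomain shuts down or the list is cleared.
    public static readonly List<byte[]> Results = new List<byte[]>();
}

class Program
{
    static void Main()
    {
        Cache.Results.Add(new byte[10 * 1024 * 1024]); // 10 MB rooted by the static field
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
        // The 10 MB buffer is still alive here despite the full collection;
        // only clearing the root makes it eligible for collection.
        Cache.Results.Clear();
    }
}
```

The same applies to COM references held in static fields: their wrappers stay rooted, so the unmanaged side is never released either.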
MarcinDruzgala
Occasional Contributor
Thanks a lot Jason for your tips, I will try your approach to releasing COM objects -> I thought that if I released only the ICursor object, the IRow object would be destroyed too, but it seems I was wrong. The next thing I was apparently wrong about: I thought releasing COM objects inside a loop would be an error.

About global static objects you are right, I've got 'a few' of them... I'm aware I should probably rewrite the whole utility. I started writing it a few months ago (5-6), then I had about a 4-month pause, and now that I'm back with it I just want to sit down and rewrite the whole thing, but there is no time.

I will change few things, run next test and I will post the results here.
MarcinDruzgala
Occasional Contributor

OK, I've used LeakDiag to investigate the memory usage problem, but honestly the results don't say much to me. Can you please take a look at them? I'm attaching the XML files in a .rar, so you can view them at least in the LDGrapher application.
I did tests for 2 types of Memory allocators:
1. Windows Heap Allocator (4 logs, logged every 5 minutes)
2. COM Allocator (same: 4 logs / 5-minute interval)

ArcMap was already using ~1,8 GB of RAM.

I'm attaching the graphs I got from loading these logs into LDGrapher, with the most memory-consuming lines expanded.

[ATTACH=CONFIG]22872[/ATTACH][ATTACH=CONFIG]22873[/ATTACH]


Regards,
MDruzgala
JasonPike
Occasional Contributor
Those logs are missing a lot of information that we need. Have you set up your symbol path?

In LeakDiag, you use the following steps:

1) Tools -> Options

2) Set the symbol search path to this:
cache*C:\dev\symbols;SRV*C:\dev\symbols*http://msdl.microsoft.com/download/symbols;SRV*C:\dev\symbols*http://downloads2.esri.com/Support/symbols/


3) Check the "Resolve symbols when logging."

4) Change the max stack depth to 32.

For most other tools (UMDH, WinDbg, etc.) you add a User Environment Variable called _NT_SYMBOL_PATH and set its value to the search path described above.

I'm afraid ArcMap will choke LeakDiag with these settings; it works great if you can create a small test project that reproduces the problem, though. I typically use UMDH if I have no choice but to use full-blown ArcMap.

LeakDiag and UMDH give us a good look at unmanaged code; they show the managed code's effects only indirectly, so a log from CLRProfiler would give us a different perspective that may also be useful.

Finally, what I was able to gather from your logs is that there seem to be a lot of unmanaged strings allocated. Before you log anything else, add a system environment variable called OANOCACHE and set its value to 1 to disable BSTR caching, so that we get an accurate picture of which code is responsible for those allocations.