
Java Maps SDK Memory Leak?

04-14-2023 08:20 AM
JaredFaulk
New Contributor III
The Problem
I am building a GIS module for a system using the ArcGIS Java Maps SDK v200.0.0. The general flow is: I connect to a few Esri servers, pull down some features/data using their APIs, run a few calculations, and write the results to a file. Since building my module and running the calculations on a local test server, I have noticed that our virtual memory usage keeps climbing with each subsequent calculation, until ultimately the application crashes. This clearly sounds like some form of memory leak.
 
What I've Done
Firstly, I investigated the memory usage of the JVM, thinking perhaps some large objects were not being garbage collected. I mapped our memory usage in VisualVM as seen here: JaredFaulk_0-1681484882344.png

 

However, everything is performing as expected. Memory usage spikes for each new calculation, and then all unreferenced objects are garbage collected at the end of the calculation. I also checked our metaspace usage, but that never exceeds 50MB (so metaspace is not the issue). I am therefore dealing with some form of native memory leak.
 
Because the memory leak is outside of the JVM, the most typical culprit is some form of file stream not being closed. I do write to a file at the end of my calculation, but I DO close it, as seen here:
. . .
//map here is an Esri ArcGISMap object that holds some data and must be loaded fully
map.addDoneLoadingListener(() -> {
    if (map.getLoadStatus() == LoadStatus.LOADED) {
        String mapJson = map.toJson();
        FileWriter jsonFileWriter = null;
        try {
            //file here is a valid File already created
            jsonFileWriter = new FileWriter(file);
            jsonFileWriter.write(mapJson);
            mapDoneWriting = true;
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            try {
                if (jsonFileWriter != null) {
                    jsonFileWriter.flush();
                    jsonFileWriter.close();
                }
            } catch (IOException e) {
                e.printStackTrace();
            }
        }
    } else {
        throw new IllegalArgumentException("Error writing json map file");
    }
});

. . .
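For completeness, here is roughly how the same write looks with try-with-resources, which flushes and closes the writer automatically even if an exception is thrown. This is only a sketch of equivalent logic (it assumes the same map, file, and mapDoneWriting from the snippet above), not the code that was actually profiled:

// Minimal try-with-resources sketch; assumes the same map, file, and mapDoneWriting as above.
map.addDoneLoadingListener(() -> {
    if (map.getLoadStatus() == LoadStatus.LOADED) {
        // The writer is flushed and closed automatically when the try block exits.
        try (FileWriter jsonFileWriter = new FileWriter(file)) {
            jsonFileWriter.write(map.toJson());
            mapDoneWriting = true;
        } catch (IOException e) {
            e.printStackTrace();
        }
    } else {
        throw new IllegalArgumentException("Error writing json map file");
    }
});

Either way, the writer does get closed, which is why I ruled out the file handling as the source of the leak.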

 

 
To chase this leak further, I found two blog posts, [here] and [here], which describe a very similar, obscure memory leak. In summary: other large Java systems had a native memory leak that could not be found, and both blogs successfully debugged and patched the leak using a tool called jemalloc (spoiler: the culprit was an Inflater/Deflater object used for compression/decompression not being closed). Jemalloc is essentially a memory allocator much like malloc, but with additional debugging functionality. I replaced the JVM's default malloc with jemalloc and then generated memory usage reports in a tree structure using jeprof (a reporting tool bundled with jemalloc). Now here is where my debugging is hitting a roadblock.
 
Here are the reports generated (zoomed into the most likely culprits):
JaredFaulk_1-1681485225247.png

 

Zooming into some weird potential culprits:
JaredFaulk_2-1681485270394.png

 

And further weirdness:
JaredFaulk_0-1681485360219.png

 

Here is where I need some help. I am interpreting these results based solely on the examples I have seen from similar reports (each blog post I link shows an example; I could not find documentation on how to interpret this graph). From my understanding, the bottom percentage in each node is the percentage of my application's total memory that is attributable to that method. That points to roughly 90% of my memory being used under RT_Vector_setElementRemovedCallback, with about 70% of that in lerc_decodeToDouble (?). I found the LERC project, which has a GitHub repository (Lerc Repository), and I also found the 'decodeToDouble' function, which is here: decodeToDouble.
 
MY QUESTION IS:
Does this appear to be an Esri memory leak in the SDK I am using, or am I interpreting these results incorrectly? If this is a memory leak, it is potentially a serious bug in the Java Maps SDK. Thank you for your feedback.
18 Replies
MarkBaird
Esri Regular Contributor

Just to update you: we are currently debugging this and will report back on what we find.

I would however be interested to find out a little more about the application you are writing.  Is this a UI-based application (with a JavaFX user interface), or is it a back-end server app?

JaredFaulk
New Contributor III

Awesome, thank you Mark. So this portion of code is located in our backend, solely used for calculation purposes. No UI interactions or MapViews in this portion of our system. 

MarkBaird
Esri Regular Contributor

Just wanted to update you that we've managed to reproduce the issue.  It's unusual in that things work fine for several hours on Windows and Mac, but we only see the growing memory use on Linux.

We've not tracked it down yet, but Linux-based investigations are ongoing.

JaredFaulk
New Contributor III

Great, glad you guys were able to reproduce the issue. I probably should have mentioned that I am using Linux for our test environment. Best of luck debugging!

JaredFaulk
New Contributor III

Good morning, I just wanted to follow up on the status of this issue. Thanks.

MarkBaird
Esri Regular Contributor

We are still investigating it.  I will report back once we've got a better handle on it.

 

JaredFaulk
New Contributor III

Sounds good. Thank you for the response.

JaredFaulk
New Contributor III

Howdy Mark, I believe I have tracked down the issue: what is causing the memory leak is the GeometryEngine.union() method. I have sent some reproducer code to support, and I have also pushed the new reproducer code to a branch called 'union' in the GitHub repository.
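The reproducer essentially boils down to calling GeometryEngine.union() repeatedly and watching the process's resident memory climb on Linux while the JVM heap stays flat. Roughly something like the sketch below (this is only a simplified stand-in with arbitrary geometries and a hypothetical class name, not the exact code I sent to support, and it omits the usual ArcGISRuntimeEnvironment licensing/setup):

import com.esri.arcgisruntime.geometry.GeometryEngine;
import com.esri.arcgisruntime.geometry.Point;
import com.esri.arcgisruntime.geometry.PointCollection;
import com.esri.arcgisruntime.geometry.Polygon;
import com.esri.arcgisruntime.geometry.SpatialReferences;

// Hypothetical class name; simplified stand-in for the actual reproducer sent to support.
public class UnionLeakSketch {
    public static void main(String[] args) {
        // Two arbitrary overlapping squares in WGS84.
        PointCollection pointsA = new PointCollection(SpatialReferences.getWgs84());
        pointsA.add(new Point(0, 0));
        pointsA.add(new Point(0, 10));
        pointsA.add(new Point(10, 10));
        pointsA.add(new Point(10, 0));
        Polygon polygonA = new Polygon(pointsA);

        PointCollection pointsB = new PointCollection(SpatialReferences.getWgs84());
        pointsB.add(new Point(5, 5));
        pointsB.add(new Point(5, 15));
        pointsB.add(new Point(15, 15));
        pointsB.add(new Point(15, 5));
        Polygon polygonB = new Polygon(pointsB);

        // Union the same two geometries in a loop. The JVM heap stays flat,
        // but on Linux the process's resident (native) memory keeps growing.
        for (int i = 0; i < 1_000_000; i++) {
            GeometryEngine.union(polygonA, polygonB);
        }
    }
}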

MarkBaird
Esri Regular Contributor

@JaredFaulk 

Yes, I'm talking to TJ from support and have your latest reproducer, which looks like it will help us narrow down the issue.  Thanks for putting this together.