
Extremely High Memory Usage Iterating through Features in Java Maps SDK

06-30-2023 02:32 PM
JaredFaulk
Emerging Contributor
Previously, in Maps SDK Memory Leak?, I started chasing what I believed to be a memory leak in the Java Maps SDK. I was able to reproduce large memory growth with the GeometryEngine.union() method (~10 MB per second when a union is kept running in a while loop with periodic forced garbage collection). Note that the JVM heap hardly grew; the growth was all in resident memory.
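The repro was essentially a loop like this (a minimal sketch, assuming a pre-loaded list of polygon geometries called "polygons"; it is not the exact production code):

// Sketch: repeatedly union polygons while watching resident memory.
// 'polygons' is assumed to be a pre-loaded List<Geometry> of polygon geometries.
Geometry running = polygons.get(0);
long iterations = 0;
while (true) {
    for (Geometry g : polygons) {
        running = GeometryEngine.union(running, g);
    }
    iterations++;
    if (iterations % 10 == 0) {
        System.gc(); // periodic forced GC: the heap stays flat, but resident memory keeps climbing
    }
}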
 
So I refactored our code base to NOT use union (as it made our system unusable in its current state). I then reran our code on a Linux server (both Ubuntu and Alma Linux) and noticed that memory was still skyrocketing for each calculation (e.g. 10 GB of resident memory, while the JVM heap never exceeded 2 GB during the calculation).
 
I then started investigating and found that each query added quite a lot to resident memory.
 
Here is a synopsis of my code to query:
 
// set all the QueryParameters items up here

List<Feature> finalFeatures = new LinkedList<Feature>();

// queryFeaturesAsync returns a ListenableFuture that resolves to a FeatureQueryResult
ListenableFuture<FeatureQueryResult> future = featureTableToQuery.queryFeaturesAsync(query, queryField);

try {
    FeatureQueryResult features = future.get();
    // iterate the result and collect every feature into the list
    for (Feature feature : features) {
        finalFeatures.add(feature);
    }
} catch (Exception e) {
    e.printStackTrace();
}
return finalFeatures;

I noticed the jump in memory for each subsequent "future.get()" call and tried to isolate the problem further. Is it really a querying problem, or is there a memory issue with the features themselves?
 
So I reran some tests, one of which tested JUST loading features. To isolate this from querying, I wrote a set of features I got back from a query (160,000 polygon features) to a JSON file by creating a FeatureCollection and calling .toJson() on it.
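The export step looked roughly like this (a minimal sketch; the queriedFeatures list, the output path, and the use of the FeatureCollectionTable constructor that copies features are assumptions, not the exact production code):

// Sketch: dump previously queried features to JSON so loading can be tested in isolation.
// 'queriedFeatures' is assumed to be the List<Feature> produced by the query code above.
FeatureCollectionTable table = new FeatureCollectionTable(queriedFeatures);
FeatureCollection collection = new FeatureCollection();
collection.getTables().add(table);

String json = collection.toJson();
Files.writeString(Path.of("features.json"), json);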
 
Then I would read the JSON file back in to get a FeatureCollection, iterate through all the features, and add them to a list.
Like this:
File jsonFile = new File(filepath);
if (jsonFile.exists()) {
    Path jsonPath = Path.of(filepath);
    String json = Files.readString(jsonPath);

    // rebuild the FeatureCollection from the exported JSON
    FeatureCollection collect = FeatureCollection.fromJson(json);
    List<FeatureCollectionTable> featureTables = collect.getTables();

    // iterate every table and collect all of its features into one list
    List<Feature> allFeatures = new LinkedList<Feature>();
    for (FeatureCollectionTable ft : featureTables) {
        Iterator<Feature> features = ft.iterator();
        while (features.hasNext()) {
            allFeatures.add(features.next());
        }
    }

    System.out.println(allFeatures.size() + " number of features");
}
 
NOTE: the JSON file for these 160,000 features is only 150 MB.
 
When I run the code above on a Windows machine, memory spikes from a couple hundred MB up to 7 GB! If I force garbage collection on Windows, it clears this memory down to about 400 MB after exiting the method. If I run it on Linux, we see similar growth in resident memory.
 
My question is: why is there so much memory usage when loading features from a relatively small file? (NOTE: I see the same issue when I query and iterate through the features as well.) If I run a couple of threads executing this FeatureCollection iteration in parallel, I can easily hit an out-of-memory error. What is taking up 7 GB of memory just to load a set of features? Am I missing something here?
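For context, the parallel run that hits OOM looks roughly like this (a sketch only; the thread count, file path, and the loadFeaturesFromJson helper name are placeholders standing in for the loading code above):

// Sketch: run the FeatureCollection loading code above on a few threads in parallel.
// 'loadFeaturesFromJson' is a placeholder for the loading method shown earlier.
ExecutorService pool = Executors.newFixedThreadPool(4);
for (int i = 0; i < 4; i++) {
    pool.submit(() -> loadFeaturesFromJson("features.json"));
}
pool.shutdown(); // resident memory climbs well past the JVM heap while these run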
 
4 Replies
MarkBaird
Esri Regular Contributor

@JaredFaulk we are continuing to look at this issue for you.  Work on your support issue has not stopped.

JaredFaulk
Emerging Contributor

Thanks Mark. I received an email last Friday saying support has closed my case for the union issue. Though the memory seemed high with union, support was not able to reproduce a crash with it. Since we are still hitting OOM issues after removing union, there is something else going on. I will start a new case with Support, if you think that makes sense.

MarkBaird
Esri Regular Contributor

Yes, do open a new case with support; they know the situation. In the meantime, as I say, we're not ignoring this; it's an ongoing investigation.

JaredFaulk
Emerging Contributor

Will do. Glad it is being investigated. Thanks for the response Mark.
