Memory usage after loading terrain

06-02-2022 12:12 AM
AdamChełstowski
New Contributor II

I have a problem with unloading a terrain placed in the scene. Since the terrain I'm working with is pretty huge, I decided to load only a fragment of it, do some operations, and then unload it to make room for the next one. The problem is that unloading the terrain does not seem to free up memory.

After loading and unloading hundreds of times from a script, my memory usage exceeds 50 GB and, soon enough, CityEngine shows an out-of-memory error. I also tried loading and unloading manually, without the script, and the outcome is the same: after doing it 10 times, Windows Task Manager shows that the memory is not being freed.

Is this a memory leak? Should I be using ce.delete() or some other method of disposal?

I am working with CityEngine 2021.0.7316. I attached a very simple project that reproduces the error: open MemoryTestScene.cej and run simpleMemoryTestScript.py.
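To illustrate the workflow, a minimal version of such a load/unload loop could look like the sketch below (the file path and the processing step are placeholders; the attached simpleMemoryTestScript.py is the actual reproduction):

```python
# Simplified sketch of the load/process/unload loop described above.
# The terrain path and the processing step are placeholders, not the
# exact contents of the attached simpleMemoryTestScript.py.
from scripting import *

# get a CityEngine instance
ce = CE()

TERRAIN_FILE = "data/terrain_fragment.tif"  # placeholder path to one terrain fragment

def loadProcessUnload(iterations=100):
    for i in range(iterations):
        # Import the terrain fragment (default import settings; the real
        # script may pass explicit settings).
        ce.importFile(ce.toFSPath(TERRAIN_FILE))

        # ... do some operations on the loaded terrain here ...

        # Remove the imported layers again, expecting the memory to be freed.
        # (In a real project the terrain layer would be selected more specifically.)
        layers = ce.getObjectsFrom(ce.scene, ce.isLayer)
        ce.delete(layers)

if __name__ == '__main__':
    loadProcessUnload()
```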

3 Replies
JonasObertuefer
Esri Contributor

Hi @AdamChełstowski,

Thanks a lot for reporting this. It is indeed a memory leak. Until it gets fixed, you can try the attached modified version of your simpleMemoryTestScript.py; it does not fix the leak, but it should make it far less severe.

Best
Jonas

EvelynHsu
New Contributor III

Hi Jonas,

 
Thank you for sharing the script. We recently ran into something similar: we use a Python script to iteratively import Multipatch feature classes, run the CleanupShapes function on them, and then export the results to a fresh file geodatabase. We have a lot of such data to process, so we usually just let the script run for days inside a single CityEngine instance. After a day or two we typically get an "Out of memory" error, and we simply restart CityEngine and repeat the process.

I'm wondering whether there are any good practices you could share for this kind of usage. I noticed that your modified script imports via temporary folders and deletes them afterwards. Would you recommend a similar approach for importing large Multipatch datasets?
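To illustrate what I mean, a minimal sketch of such a temp-folder import loop could look like this (paths, folder names, and the processing step are placeholders; this is not the actual modified script from the attachment):

```python
# Simplified sketch of the temp-folder import pattern described above.
# All paths and the processing step are placeholders.
import os
import shutil

from scripting import *

# get a CityEngine instance
ce = CE()

SOURCE_GDB = "data/multipatch_input.gdb"  # placeholder path to one input file geodatabase

def importProcessCleanup(index):
    # Copy the source into a throwaway folder inside the project and import
    # from there, so the temporary copy can be deleted once we are done.
    tmpDir = ce.toFSPath("data/tmp_import_%d" % index)
    os.makedirs(tmpDir)
    try:
        # A file geodatabase is a folder, hence copytree.
        tmpCopy = os.path.join(tmpDir, os.path.basename(SOURCE_GDB))
        shutil.copytree(ce.toFSPath(SOURCE_GDB), tmpCopy)

        # Import the temporary copy (default import settings for brevity).
        ce.importFile(tmpCopy)

        # ... run CleanupShapes and the export to a fresh file geodatabase here ...

        # Remove the imported layers from the scene again.
        ce.delete(ce.getObjectsFrom(ce.scene, ce.isLayer))
    finally:
        # Delete the temporary folder and refresh the workspace so no stale
        # references accumulate between iterations.
        shutil.rmtree(tmpDir, ignore_errors=True)
        ce.refreshWorkspace()

if __name__ == '__main__':
    for i in range(10):
        importProcessCleanup(i)
```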
 
Thanks.
JonasObertuefer
Esri Contributor

Hi @EvelynHsu, we would like to investigate this further, but it's most likely not directly related to the original thread, so I've sent you a DM.
