
Python Scripts in ArcMap 10.6 can't seem to use much more than 10MB before a MemoryError

Discussion created by grant.zimmerman1 on Apr 30, 2018
Latest reply on Apr 30, 2018 by Dan_Patterson

I've been encountering MemoryErrors in ArcMap 10.6 Advanced when running script tools that do a number of things with NumPy and Pandas: converting a table or feature class to a NumPy array with arcpy.TableToNumPyArray (or FeatureClassToNumPyArray), converting that NumPy array to a Pandas DataFrame, doing things in/with said DataFrame, and then sometimes converting back to a NumPy array and out with arcpy.NumPyArrayToTable.  In other words, I'm experiencing MemoryErrors at any step in this workflow (GDB Table -> NumPy Array -> Pandas DataFrame -> NumPy Array -> GDB Table).  As such, I don't think it's a problem with any one script or any step in a script; I think it has to do with memory.
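
For context, a minimal sketch of that round trip looks like the following (the geodatabase path and field names are placeholders, not my actual data, and I'm showing the arcpy.da versions of the conversion functions):

```python
# Minimal sketch of the round trip described above; the gdb path and
# field names below are placeholders, not the actual data.
import arcpy
import numpy as np
import pandas as pd

in_table = r"C:\data\example.gdb\in_table"    # placeholder input table
out_table = r"C:\data\example.gdb\out_table"  # placeholder output table (must not already exist)

# GDB table -> structured NumPy array
arr = arcpy.da.TableToNumPyArray(in_table, ["OBJECTID", "VALUE"])

# NumPy array -> Pandas DataFrame
df = pd.DataFrame(arr)

# ...work on the DataFrame here...

# DataFrame -> structured NumPy array -> GDB table
out_arr = np.asarray(df.to_records(index=False))
arcpy.da.NumPyArrayToTable(out_arr, out_table)
```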

 

Which would make sense, except that the size of the NumPy array or Pandas DataFrame never seems to be higher than about 12 MB (using getsizeof from sys to test the size of the array or DataFrame) before I get a MemoryError.  I have 16 GB of memory on 64-bit Windows 10 and never have less RAM available than the 3-4 GB that ArcMap is capable of using as a 32-bit application; I have confirmed this using psutil.virtual_memory() from the ArcMap Python console.  I even went through the process of making the Python 2.7 installation LargeAddressAware, to no avail.  I can run the script tool in or out of process with no difference; I still get MemoryErrors if the array or DataFrame is much over 10 MB.  While I can limit the size of the input data to get any one of the many affected script tools to work, I would love to be able to use my script tools on sizable pieces of data.
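
For reference, this is roughly how I've been checking sizes and available RAM (the array and DataFrame below are just stand-ins for the real data); I've also included arr.nbytes and df.memory_usage(deep=True) as alternative measures, since getsizeof doesn't necessarily capture everything a DataFrame holds:

```python
# Rough sketch of the size / free-memory checks described above;
# the array and DataFrame are stand-ins, not the real data.
import sys
import numpy as np
import pandas as pd
import psutil

arr = np.zeros(1000000, dtype=[("VALUE", "<f8")])  # stand-in structured array, ~8 MB of doubles
df = pd.DataFrame(arr)

print(sys.getsizeof(arr))                 # what I've been using to size the array/DataFrame
print(arr.nbytes)                         # raw buffer size of the NumPy array
print(df.memory_usage(deep=True).sum())   # per-column total for the DataFrame
print(psutil.virtual_memory().available)  # free RAM, as checked from the ArcMap Python console
```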

 

What DOES work is running a script from IDLE, completely outside of ArcMap (of course still importing arcpy).  There I can run any one of my scripts on rather large datasets (for example: 100,000 records with 600+ double fields) and python.exe will use as much RAM as it needs, happily climbing over 2 GB.  Try to do the same thing from within ArcMap and it will throw a MemoryError at a roughly 500 MB baseline, without so much as a bump in RAM usage from the script.
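
For scale, a quick back-of-the-envelope calculation (assuming 8-byte doubles) shows roughly how much raw array data that example dataset represents, before any intermediate copies from the array -> DataFrame -> array conversions:

```python
# Rough size of the example dataset above, raw array data only
rows, fields, bytes_per_double = 100000, 600, 8
size_mb = rows * fields * bytes_per_double / float(1024 ** 2)
print(size_mb)  # ~458 MB before Pandas overhead or intermediate copies
```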

 

So am I missing something? 

-Is a Python script run from ArcMap not supposed to use more than about 12MB? 

-Is there a setting somewhere that will allow me to increase the amount of memory Python can use in ArcMap?

-Is there something I can do in my scripts to preemptively expand the amount of memory available to Python inside of ArcMap?

-Did ESRI ludicrously decide to cap the amount of memory Python can use on an EVA (student) License? 

-Have I missed something simple? (To be fair, I'm likely just a step above novice when it comes to Python)

-Has none of this made any sense and I've lost my mind from months of staring at ArcMap?

 

Any help/suggestions would be appreciated.


Note: I'd be happy to post some code, but since I'm seeing this problem with many different scripts and tools that use NumPy and Pandas, I'm not sure fixating on any one script will be useful.
