I've been encountering MemoryErrors in ArcMap 10.6 Advanced when running script tools that use NumPy and Pandas: converting a table or feature class to a NumPy array with arcpy.da.TableToNumPyArray (or arcpy.da.FeatureClassToNumPyArray), converting that array to a Pandas DataFrame, working with the DataFrame, and sometimes when converting back with arcpy.da.NumPyArrayToTable. In other words, the MemoryError can occur at any step of the workflow (GDB table -> NumPy array -> Pandas DataFrame -> NumPy array -> GDB table). Because of that, I don't think the problem lies in any one script or step; it looks like a memory issue.
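For reference, every one of the failing tools boils down to a round trip like the sketch below. The arcpy calls are commented out here because the question is about the NumPy/Pandas middle; the structured array stands in for what arcpy.da.TableToNumPyArray actually returns, and the field names and values are made up:

```python
import numpy as np
import pandas as pd

# Stand-in for: arr = arcpy.da.TableToNumPyArray(table, fields)
# That call returns a NumPy structured array shaped like this one.
arr = np.zeros(5, dtype=[("OID", "<i4"), ("VALUE", "<f8")])
arr["OID"] = np.arange(5)
arr["VALUE"] = np.linspace(0.0, 1.0, 5)

# NumPy structured array -> Pandas DataFrame
df = pd.DataFrame(arr)

# ... do things in/with the DataFrame ...
df["VALUE"] = df["VALUE"] * 2.0

# Pandas DataFrame -> NumPy structured (record) array, ready for
# arcpy.da.NumPyArrayToTable(out, out_table)
out = df.to_records(index=False)
```

The MemoryError can show up at the TableToNumPyArray line, the pd.DataFrame line, in the DataFrame work itself, or at the to_records / NumPyArrayToTable step.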
Which would make sense, except that the size of the NumPy array or Pandas DataFrame never seems to exceed about 12 MB (measured with sys.getsizeof) before I get a MemoryError. I have 16 GB of RAM on 64-bit Windows 10 and never have less free RAM than the 3-4 GB that ArcMap, as a 32-bit application, can address; I confirmed this with psutil.virtual_memory() from the ArcMap Python console. I even went through the process of making the Python 2.7 installation LargeAddressAware, with no improvement. Running the script tool in or out of process makes no difference: I still get MemoryErrors once the array or DataFrame grows much past 10 MB. I can work around it by limiting the size of the input data for any one of the many affected script tools, but I would love to be able to run them on sizable datasets.
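To make sure that 12 MB figure is real, here is how I'm checking sizes, on synthetic data so it runs outside ArcMap. One caveat I'm aware of: on older pandas versions sys.getsizeof can understate a DataFrame's true footprint, so I also look at arr.nbytes and df.memory_usage(deep=True), which report the underlying buffers directly. The row/field counts here are arbitrary:

```python
import sys
import numpy as np
import pandas as pd

# Synthetic stand-in for the array that comes back from
# arcpy.da.TableToNumPyArray: 10,000 rows, 100 double fields.
dtype = [("f{}".format(i), "<f8") for i in range(100)]
arr = np.zeros(10000, dtype=dtype)

df = pd.DataFrame(arr)

raw_bytes = arr.nbytes                            # data buffer only: 10000*100*8
df_bytes = int(df.memory_usage(deep=True).sum())  # per-column buffers + index

print(sys.getsizeof(arr), raw_bytes, df_bytes)
```

Even measured this way, the arrays that trigger the MemoryError inside ArcMap are nowhere near the 3-4 GB the process should be able to address.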
What DOES work is running a script from IDLE, completely outside of ArcMap (still importing arcpy, of course). I can run any of my scripts on rather large datasets (for example, 100,000 rows with 600+ double fields), and python.exe will use as much RAM as it needs, happily climbing past 2 GB. Try the same thing from within ArcMap and it throws a MemoryError at roughly a 500 MB baseline, without so much as a bump in the script's RAM usage.
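Back-of-envelope arithmetic on that IDLE-only dataset shows the raw data buffers alone need roughly half a gigabyte, which lines up with the ~500 MB baseline where ArcMap fails but is comfortably inside what a LargeAddressAware 32-bit process should handle:

```python
# 100,000 rows x 600 float64 fields, 8 bytes per value
rows = 100000
fields = 600
bytes_per_double = 8

data_mb = rows * fields * bytes_per_double / 1024.0 / 1024.0
print(round(data_mb, 1))  # ~457.8 MB for the raw buffers alone
```

So the IDLE run genuinely needs the 2 GB+ it takes (buffers exist simultaneously as array and DataFrame, plus copies), yet the in-ArcMap run dies long before approaching any 32-bit limit.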
So am I missing something?
-Is a Python script run from ArcMap not supposed to use more than about 12MB?
-Is there a setting somewhere that will allow me to increase the amount of memory Python can use in ArcMap?
-Is there something I can do in my scripts to preemptively expand the amount of memory available to Python inside of ArcMap?
-Did ESRI ludicrously decide to cap the amount of memory Python can use on an EVA (student) License?
-Have I missed something simple? (To be fair, I'm likely just a step above novice when it comes to Python)
-Has none of this made any sense and I've lost my mind from months of staring at ArcMap?
Any help/suggestions would be appreciated.
Note: I would be happy to post some code, but as I'm seeing this problem with many different scripts and tools that use Numpy and Pandas, I'm not sure fixating on any one script will be useful.
Move to ArcGIS Pro; your scripts will most likely work with little need for upgrading (there is even a 2to3 script to help with the transition). Pro obviates the need for background geoprocessing (which some arcpy tools require).
Given your configuration, you will have no memory problems within or outside of the mapping environment.
I can't: I have an EVA license, which doesn't work with Pro. I assume that's because my school hasn't decided to support Pro yet, although the fact that ESRI hasn't simply enabled Pro for all EVA licenses is absolutely insane. EVA licenses are often used on personal computers, where whether a school "supports" Pro is a non-issue; for a lowly grad student like myself, support from the school is almost always nonexistent anyway. So I'm stuck doing my research on 10.6.
Our students have access to Pro because we said we wanted it. They (i.e., IT or your administrator) should be able to issue you access. You may be on your own for support, but for grad work you should have it. Ask your advisor to submit the request; IT is there to support student work, not impede it, and I'm sure your advisor will agree.
Besides, the 'arcgis' module is nice, Jupyter notebooks are useful, Spyder is a great Python IDE, and the versions of NumPy, SciPy, Matplotlib and Pandas are more current. Check my GeoNet blog for Python/NumPy-related material.