Script runs fine in ArcMap, gets out-of-memory error in CMD

03-05-2012 09:48 AM
brianrathburn
New Contributor
I have been working on an ArcMap geoprocessing model that does a few closest facility (CF) solves and intersection queries. The model runs great in ArcMap, but when I export it to Python and run it from the CMD prompt I get an out-of-memory exception (see below):

"arcpy.Solve_na(Closest_Facility_2__2_, "SKIP", "TERMINATE")
  File "C:\Program Files (x86)\ArcGIS\Desktop10.0\arcpy\arcpy\na.py", line 1378,
in Solve
    raise e
arcgisscripting.ExecuteError: ERROR 030024: Solve returned a failure.
Out of memory.
Failed to execute (Solve)."
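
For context, the exported script boils down to something like this (a rough reconstruction, not my actual export; the network dataset, layer, and feature class names below are placeholders):

import arcpy

# Check out the Network Analyst extension license
arcpy.CheckOutExtension("Network")

# Build a closest facility layer on the network dataset (placeholder paths/names)
cf_layer = "Closest Facility"
arcpy.MakeClosestFacilityLayer_na("D:/data/streets_nd", cf_layer, "Length")

# Load facilities and incidents into the analysis layer
arcpy.AddLocations_na(cf_layer, "Facilities", "D:/data/facilities")
arcpy.AddLocations_na(cf_layer, "Incidents", "D:/data/incidents")

# This is the call that raises ERROR 030024 when run from CMD
arcpy.Solve_na(cf_layer, "SKIP", "TERMINATE")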

Watching it run in Task Manager, it chews up about 1 GB of RAM and dies; watching the same thing run in ArcMap, it chews up about 1 GB of RAM and succeeds.


I have Googled the error code and briefly searched the forums without much luck. I have seen some people mention an option for running "in process", but I'm not really sure what that means.

Any help would be appreciated.

On a side note, the system doesn't actually run out of memory: the development machine has 8 GB total, and the run always fails before total usage reaches 5 GB.

Thanks
Brian
MathewCoyle
Frequent Contributor
Just an aside: ArcMap is a 32-bit process, so even with large addresses enabled on a 64-bit OS it will never use more than 4 GB of memory. I assume you are using a 64-bit OS based on your total memory. The default out of the box is, I think, 2 GB.
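
If you want to confirm which interpreter is actually doing the work, this snippet (standard library only) reports the bitness of the running Python:

import struct

# Prints 32 for the Python bundled with ArcGIS Desktop, 64 for a native 64-bit build
print("%d-bit" % (struct.calcsize("P") * 8))
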
brianrathburn
New Contributor
Thanks for the quick reply.

The 5 GB I mentioned above was total system memory usage.

I looked more closely at the actual memory usage of the "python.exe *32" process right before it crashed, and it maxed out between 1.6 GB and 1.7 GB.

Running it again in ArcMap, where it completes successfully, it used a maximum of 1.78 GB; when run inside ArcMap, though, the work happens in the "ArcMap.exe *32" process.
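
In case anyone wants the script to report its own memory use instead of eyeballing Task Manager, something like this should work, assuming the third-party psutil package is installed (it does not ship with ArcGIS's Python):

import os
import psutil  # third-party package; not bundled with ArcGIS's Python

proc = psutil.Process(os.getpid())
# Log the process's working set, e.g. right before the Solve call
print("RSS: %.1f MB" % (proc.memory_info().rss / 1024.0 / 1024.0))
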


So is there any way to run the Python script in the ArcMap process from the command line?



-Brian
brianrathburn
New Contributor
Found a possible solution here:

http://gisgeek.blogspot.com/2012/01/set-32bit-executable-largeaddressaware.html

It turns out the Python included with the 10.0 release was not compiled to be Large Address Aware, so even though it runs as a 32-bit process it can only address a maximum of 2 GB of RAM.

I tried the fix above locally and it doesn't seem to make a difference for me; I will try it on a machine with more memory and see how that goes.
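
In case it helps anyone else, here is a quick sketch of mine (not from the blog post) to check whether the flag actually got set, by reading it straight out of the executable's PE header; the path below is just an example:

import struct

def is_large_address_aware(exe_path):
    # Read the COFF Characteristics field of the PE header and test the
    # IMAGE_FILE_LARGE_ADDRESS_AWARE flag (0x0020).
    with open(exe_path, "rb") as f:
        f.seek(0x3C)                            # e_lfanew: offset of the PE signature
        pe_offset = struct.unpack("<I", f.read(4))[0]
        f.seek(pe_offset + 22)                  # "PE\0\0" (4 bytes) + 18 bytes of COFF header
        characteristics = struct.unpack("<H", f.read(2))[0]
    return bool(characteristics & 0x0020)

print(is_large_address_aware(r"C:\Python26\ArcGIS10.0\python.exe"))
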



-Brian
SebastianKrings
Occasional Contributor
Hello,

we have a similar problem, and a little searching shows that we are not alone.

A very similar problem is described here:
http://gis.stackexchange.com/questions/21376/network-analyst-arcpys-solve-running-out-of-memory

ESRI Ticket:
http://support.esri.com/en/knowledgebase/techarticles/detail/30759

Another example:
http://stackoverflow.com/questions/7836672/python-how-to-build-and-access-a-large-collection-of-data...

Our case:
When we run the script in ArcMap, everything works fine, with ArcCatalog using about 700 MB at most.
When we export the script, it crashes (it ran through once out of 20 attempts) at under 400 MB while the solve (closest facility, slightly over 1 million edges) is processing.
But there is a big difference from all the other reports: we still have the logic stored as a toolbox/model in a GDB rather than exported to a script. On top of that, we have another model that runs this model (the one with the logic); we do this to have central access to further models and to define their order, though for testing we only run this single model. Run this way, it crashes as described above.
When we export the script directly and run it, it works fine, without memory failures.
Another difference is that we do not have that much data; we only slightly exceed the limit of 1 million edges mentioned in the ESRI ticket.
But when we expand the solve task (in my case by running "arcpy.AddLocations_na(...)" three times), the process stops with the memory failure.
I could imagine the following:
- ArcCatalog has better resource planning, or some configuration that lets it use memory more efficiently.
- ArcPy does not, and fails on large amounts of data.
- When (like us) you run the model through a parent model, perhaps each model reserves its own memory space, or together they climb to the memory peak faster, or something along those lines.
But why?

Here is the code of the parent model:

# Import the arcpy module
import arcpy

# Check out the Network Analyst extension license
arcpy.CheckOutExtension("Network")

# Load the toolbox stored inside the file geodatabase
arcpy.ImportToolbox("D:/data/esri/GDB/database.gdb/toolbox")

# Local variables: (none)

# Process: run the child model, which contains the actual solve logic
arcpy.model()



One key thing the ESRI ticket does not address is the fact that solving all of these problems within ArcCatalog (via ModelBuilder / the GUI) works fine!
The out-of-memory problem only occurs when using ArcPy by running the Python script from the command line, which is necessary to run the processes in the background/automatically, without any GUI.
The main question is: why does ArcCatalog work while Python doesn't? Where's the difference?
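
One workaround we are considering (only a sketch, under the assumption that the leak accumulates across successive solves; the script name and layer names are hypothetical) is to isolate each solve in its own short-lived Python process, so whatever memory it leaks is returned to the OS when the child exits:

import subprocess
import sys

# Run each solve in a fresh python.exe; "run_one_solve.py" would do the
# AddLocations_na/Solve_na work for a single facility set and then exit.
for facilities in ("facilities_a", "facilities_b", "facilities_c"):
    subprocess.check_call([sys.executable, "run_one_solve.py", facilities])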

Thanks for any ideas.
JanBorovanský
Esri Contributor
Hi all,

we also had an issue with a standalone Python script. If we ran it from a toolbox in ArcCatalog, the script finished correctly in about 1 hour (as expected). If we ran it from CMD, it failed every time with the following error message: 'System.OutOfMemoryException'. After further investigation we found that an application like ArcCatalog handles the available memory much better than standalone Python, and that our script (running as a 32-bit app) requires more memory than a 32-bit process is allowed to take. We came to the following solution: you have to mark python.exe or pythonw.exe as Large Address Aware with EDITBIN.exe (http://msdn.microsoft.com/en-us/library/xd3shwhf%28v=vs.100%29.aspx). We made the change in Visual Studio, and the standalone script now works fine from CMD. Our tests were done using 10.1 SP1 (Python 2.7).
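
For reference, the invocation looks like this from a Visual Studio command prompt (the Python path is an example; match it to your ArcGIS install):

editbin /LARGEADDRESSAWARE "C:\Python27\ArcGIS10.1\python.exe"

You can verify the change afterwards with "dumpbin /headers python.exe" and look for the line "Application can handle large (>2GB) addresses".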

Hope this helps,
Jan