
Virtual Memory

10-13-2011 09:33 AM
JaimeMillan
Emerging Contributor
I'm running a loop over a Network Analyst analysis.
After some work, and after increasing my desktop's memory to 8 GB, I'm able to run it for a while, but ArcGIS still crashes. I suppose it's a virtual memory problem.

Does anyone know how to increase the virtual memory used by ArcGIS? Something like the set mem command in Stata?

Best
J
7 Replies
MathewCoyle
Honored Contributor
Short answer, no. Long answer, hopefully soon.

http://support.esri.com/en/knowledgebase/techarticles/detail/38343

This is the part I think you are interested in...
ArcGIS Desktop on 64-bit
ArcGIS Desktop 10.0 applications are natively 32-bit applications but take advantage of a technology known as large memory awareness. This means that individual processes, such as ArcMap.exe, may be capable of accessing more than 2GB of memory (up to 4GB) when run on a 64-bit OS. Note that some functionality and third-party libraries that are part of ArcGIS are not compatible with large-address-awareness, and as such, some portions of ArcGIS may not be able to address more than 2GB even when running on a 64-bit OS.
JaimeMillan
Emerging Contributor
Wow, this is very bad news for me!
Thanks for your answer
J
StacyRendall1
Frequent Contributor
Are you able to split your origins into multiple files (or something similar) so you are dealing with smaller batches of data each time?

Network Analyst crashes a lot, and for a huge number of reasons. What else can you tell us about your situation, and are there any specific error messages?
KimOllivier
Honored Contributor
Perhaps you can look at your process. Running "network in a loop" rings alarm bells.

Perhaps you need to remove previous runs in the loop to release memory? If the first pass works, then you must clean up. Network results are held in memory; write them out to a file if you need to keep them.
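A minimal sketch of that cleanup pattern, with `fake_solve` as a stand-in for the real per-iteration solve (in practice this would be an arcpy Network Analyst call whose result is held in memory):

```python
import gc
import os
import tempfile

def fake_solve(i):
    # Stand-in for one Network Analyst solve; returns a large in-memory result.
    return [i] * 100_000

out_dir = tempfile.mkdtemp()
for i in range(5):
    result = fake_solve(i)
    # Persist whatever you need to keep, then drop the in-memory copy.
    with open(os.path.join(out_dir, "run_%d.txt" % i), "w") as f:
        f.write(str(len(result)))
    del result       # release the reference before the next pass
    gc.collect()     # encourage Python to reclaim the memory right away
```

The point is that only one iteration's result is ever alive at a time; everything durable lives on disk.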
ChristopherStorer
Emerging Contributor
The /3GB switch can also help, but it is an advanced technique, and the benefit will be minimal.
http://technet.microsoft.com/en-us/library/bb124810(EXCHG.65).aspx
DarylVan_Dyke
Deactivated User
I've found that, due to persistent memory leaks, any kind of serious iterative work with ArcGIS requires a lot of extra effort. If the same loop consistently crashes on the same iteration, and increasing the size of the input data set makes it crash consistently on an earlier iteration, then you probably have a memory leak issue.

In my experience, the only way to handle these is to build the analysis as a Python wrapper script for the operation, and then code an additional driver routine that launches that wrapper as a separate process. For a first try, look at the os.system function; once you've got that down, move to the subprocess module.
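A sketch of that driver pattern using subprocess: each loop step runs in a fresh Python process, so any memory leaked inside one step dies with that process. In real use the command would be something like `[sys.executable, "worker.py", str(step)]`; here a tiny inline script stands in for the hypothetical worker:

```python
import subprocess
import sys

# Stand-in for the contents of worker.py: reads its step number and reports.
worker = "import sys; print('step %s done' % sys.argv[1])"

outputs = []
for step in range(3):
    # Launch the wrapper in its own process; the OS reclaims all its
    # memory when it exits, leak or no leak.
    proc = subprocess.run(
        [sys.executable, "-c", worker, str(step)],
        capture_output=True, text=True,
    )
    outputs.append((step, proc.returncode, proc.stdout.strip()))
```

The driver itself stays small and long-lived; all the leaky work happens in the disposable children.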

Bonus? If you can pay the bookkeeping cost, you can launch multiple instances (loop steps) in parallel and keep things moving. Penalty? An extra helping of bookkeeping.

In my experience, even with the simplest operations, if you run them 1,000 times you can expect some of them to crash anyway (error 999999). So do code some logic to keep track of failed evaluations and generate a list of what needs to be re-run down the line.
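That failure-tracking logic can be as simple as collecting the indices of failed runs and feeding them to a retry pass. Here `flaky_task` is a made-up stand-in for one iteration that sometimes fails the way a geoprocessing tool does:

```python
def flaky_task(i):
    # Stand-in for one iteration; raises like a tool failing with 999999.
    if i % 4 == 0:
        raise RuntimeError("ERROR 999999: error executing function")
    return i * i

failed = []
results = {}
for i in range(10):
    try:
        results[i] = flaky_task(i)
    except RuntimeError:
        failed.append(i)   # record the failure for a later re-run pass

# Retry pass. This toy task is deterministic, so the same items fail again,
# but with a real tool many transient 999999 failures succeed on retry.
still_failed = []
for i in failed:
    try:
        results[i] = flaky_task(i)
    except RuntimeError:
        still_failed.append(i)
```

Anything left in `still_failed` after a retry or two is worth inspecting by hand.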
AlexOulton1
Emerging Contributor
Try deleting and recreating the geoprocessor object after each iteration; this can significantly improve performance, as the geoprocessor gets slower and slower the more loops it is forced to do.

Also, use multiprocessing's pool.map() to split your job across your PC's cores 🙂

http://blogs.esri.com/Dev/blogs/geoprocessing/archive/2011/08/29/Multiprocessing.aspx
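A minimal sketch of that pool.map() pattern, with `solve_one` as a hypothetical stand-in for the per-origin analysis (in the blog post's setup it would call a geoprocessing tool):

```python
import multiprocessing

def solve_one(origin_id):
    # Stand-in for a per-origin solve; runs in its own worker process,
    # so any memory it leaks is confined to that process.
    return origin_id * 2

if __name__ == "__main__":
    origins = list(range(8))
    # One worker per core is typical; two here to keep the sketch small.
    with multiprocessing.Pool(processes=2) as pool:
        results = pool.map(solve_one, origins)
```

Note the `if __name__ == "__main__":` guard — it is required on Windows, where worker processes re-import the main module.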