Massive OD Cost Matrix

Discussion created by HCarlos on Feb 1, 2013
Latest reply on Oct 1, 2015 by cdspatial
Hi,
I need to run a 900,000 x 240 OD Cost Matrix (not by choice!) and I'm looking for the most efficient way of getting this done. I'm using the StreetMap dataset. My destinations are spread throughout North Carolina, and most of my origins are in NC, but not all of them. I do have a cutoff of 180 minutes of travel time.

I recently did a 300,000 x 240 matrix by running subsets of 50,000 x 240. Each subset took about 2 weeks to process on a desktop dedicated to this (64-bit, 8 GB RAM, Windows 7, ArcGIS v10.1). The good news is I didn't run into any out-of-memory errors; the bad news is we could have had a power outage minutes before completion and I would have lost 2 weeks. Also, I did this manually, and it was a bit of a hassle to keep track of the different subsets (I had several false starts and stops, and I have 3 different desktops dedicated to this, living in different offices).

So, the question is, how best to tackle the 900,000 x 240 matrix (minimizing my time as well as computer time):
1) use the 50,000-origin subset concept, since it worked before
2) automate a loop that runs much smaller subsets, say 1,000 x 240 - but this leaves me with lots and lots of files to merge together (a rough sketch of what I'm picturing is below the list)
3) instead of subsetting the origins, subset the destinations (i.e., run 900,000 x 10 and do this 24 times)
4) other reasonable options?  (Not doing it isn't an option).
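
For option 2, here is roughly what I'm picturing - an untested sketch only, where the paths, the "TravelTime" impedance attribute name, and the batching by ObjectID range are placeholders for my setup. The idea is that each batch writes its own table to disk, so a crash or power outage only costs one batch, and the skip-if-exists check lets the script resume where it left off.

```python
# Rough sketch only -- paths, field names, and the impedance attribute
# ("TravelTime") are placeholders, not tested code.
import arcpy

arcpy.CheckOutExtension("Network")
arcpy.env.overwriteOutput = True

network = r"C:\Data\StreetMap\Routing_ND"        # placeholder network dataset
origins_fc = r"C:\Data\Project.gdb\Origins"      # 900,000 origin points
dests_fc = r"C:\Data\Project.gdb\Destinations"   # 240 destination points
out_gdb = r"C:\Data\Results.gdb"
batch_size = 1000

# Build the OD layer once, with the 180-minute cutoff, and load the
# destinations once; only the origins change between batches.
od_layer = arcpy.na.MakeODCostMatrixLayer(network, "ODMatrix", "TravelTime",
                                          default_cutoff=180).getOutput(0)
classes = arcpy.na.GetNAClassNames(od_layer)
arcpy.na.AddLocations(od_layer, classes["Destinations"], dests_fc)

lines_sublayer = arcpy.mapping.ListLayers(od_layer, classes["ODLines"])[0]

num_origins = int(arcpy.GetCount_management(origins_fc).getOutput(0))
oid_field = arcpy.Describe(origins_fc).OIDFieldName

for start in range(1, num_origins + 1, batch_size):
    out_table = "{0}\\od_lines_{1}".format(out_gdb, start)
    if arcpy.Exists(out_table):
        continue  # already solved in an earlier run; lets the script restart after a crash

    # Select the next batch of origins by ObjectID range (assumes the IDs are
    # roughly sequential; a dedicated batch-number field would be more robust).
    where = "{0} >= {1} AND {0} < {2}".format(oid_field, start, start + batch_size)
    batch_lyr = arcpy.MakeFeatureLayer_management(origins_fc, "origins_batch",
                                                  where).getOutput(0)

    # CLEAR wipes the previous batch's origins before loading the new ones.
    arcpy.na.AddLocations(od_layer, classes["Origins"], batch_lyr, append="CLEAR")
    arcpy.na.Solve(od_layer)

    # Save this batch's OD lines to disk so only one batch is ever at risk.
    arcpy.CopyRows_management(lines_sublayer, out_table)
    print("Finished origins {0} to {1}".format(start, start + batch_size - 1))
```

The downside is still the pile of output tables at the end, though merging them with a second script seems easier than tracking manual subsets across three desktops.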

Thanks for any ideas,
Heather