I have 7000 orders that will be serviced by 8 routes. This gives me roughly 875 orders per route.
The actual running of this VRP takes FOREVER!! (It has been running for numerous hours and is still stuck at 33% on loading orders.)
I set each route to accept a limit of 875 orders, well below the max of 1000.
My desktop runs ArcGIS 10.3.1 with the 64-bit background geoprocessing capability installed and enabled.
The PC has 16 GB of RAM that 64-bit background geoprocessing can access and a dual Intel Xeon processor that provides 24 logical cores.
Data and MXD are local so no network latency for resources.
In other words I am throwing the kitchen sink at my solver and it still is taking FOREVER!!!
Anyone have some solid advice for me besides "cut up the orders into smaller sets and run them independently"? That approach does not generate "true" routes.
Thanks in advance for any insights!!
Open up your Resource Monitor, do a run, and see how much memory is being used for a set of 875.
Part of the issue may boil down to the complexity of your network and how far the routes extend (in terms of edges) that the traversal occurs over. Obviously, shut down everything else that you can or normally would.
I spent the morning running ground up tests starting with a single route with 70 orders.
I'll cut to the chase.
A VRP with 700 orders on 8 routes with a limit of 90 orders per route will complete in a couple minutes.
Basically I gave it a very small balanced problem. Orders are very tightly packed to minimize network thrashing.
But the same model with the order limit on those routes raised to 875 will run forever. (I killed it off after 15 minutes, stuck on "Inserting orders 33%".)
I got 1400 orders to run on 8 routes allowing 175 orders per route in 15 minutes. Pretty decent.
This shows that it isn't necessarily the order count, per se, that drives up time; rather, growing the order count forces you to grow the per-route order limits, and that is what drives up time.
When the orders go up, which forces you to increase the order limit, the time grows exponentially regardless of the number of routes. When any one route gets too large, things break down. The Network Analyst limit is 1,000 orders per route; I was never able to get anything to complete with a limit of even 875.
Simply put 7000 orders on 8 routes is too big for the solver to handle.
One final note: I have a single route running with 900 orders and an order limit of 900, and it has been going for over an hour. (The operative detail is that the high order limit on the single route is what makes it run poorly.)
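To make that growth concrete, here is a small Python sketch (my own illustration, not one of the tests above) of how the sequencing search space for a single route scales with its order limit. The solver uses heuristics rather than enumerating every permutation, but the factorial size of the space it must navigate is why a jump from 90 to 875 orders per route is so dramatic:

```python
import math

def log10_permutations(n: int) -> float:
    """log10(n!) via lgamma, so huge factorials stay representable.
    n! is the number of possible visit sequences for n orders on one route."""
    return math.lgamma(n + 1) / math.log(10)

# Order limits tried in the tests above.
for n in (90, 175, 875):
    print(f"{n:4d} orders per route -> ~10^{log10_permutations(n):.0f} possible sequences")
```

Even with smart heuristics pruning that space, the gap between roughly 10^138 sequences at 90 orders and roughly 10^2196 at 875 orders matches the observed behavior: small balanced problems finish in minutes while high per-route limits never complete.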
If anyone can see a flaw in what I am doing PLEASE let me know.
That is why I was wondering about memory resources. Current ArcMap is limited in the memory it will use; Pro isn't. Unfortunately, your problem is on the 'not available yet' list: Tools that are not available in ArcGIS Pro—Appendices | ArcGIS for Desktop
Monitor your system resources and see what is being used. Alternately, can it be run with background processing?
But I have seen everything from anti-virus to lack of indices blamed.
Jay Sandhu or someone else may be able to provide next step advice since tiling into smaller blocks doesn't seem to be a preferred or possible solution.
I will try to figure out how to get this to Jay.
I'm still not quite sure how GeoNet works.
No need to get to me! I am here!!
Can I ask the use case for so many orders per route? That is, we designed the VRP to optimize deliveries per day, and normally one truck or route can do about 100 orders a day max. As you noticed, that works quite well. VRP (and TSP) is a combinatorial problem, so it will take longer and longer to solve for more orders per route, and with 875 orders per route it will take a while! Also note that when you solve for 7000 orders in one instance, it has to generate a complete OD cost matrix (with 49 million entries), and that takes CPU resources to compute and memory to keep around while the VRP solver is clustering and optimally sequencing (TSP) the orders per route.
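As a rough back-of-the-envelope sketch of that OD cost matrix (assuming, hypothetically, one 8-byte cost value per origin-destination pair; the solver's actual storage format may differ), the 49 million entries alone are substantial:

```python
# Rough memory footprint of a complete origin-destination cost matrix
# for n orders, assuming one 8-byte float per entry (an assumption,
# not the solver's documented storage format).
n = 7000
entries = n * n                          # every order to every other order
bytes_per_entry = 8                      # assumed double-precision cost
mb = entries * bytes_per_entry / 1024**2
print(f"{entries:,} entries -> roughly {mb:.0f} MB per cost attribute")
```

That is hundreds of megabytes held live while the solver clusters and sequences, before counting the network dataset, the traversal state, or any additional cost attributes.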
So let me ask again what is the use case before I try to think of some alternative ways to tackle the problem.
Thanks for the quick reply!
The use case: 8 trucks (routes) currently handle 7000 garbage pickups in a day.
The area of the pickups has a hard outline.
We are trying to optimize routes. The vehicle is a truck with the arm picking up the can that is curbside so it is very fast.
People in our streets division say each "order" takes 15 seconds on average. With time constraints it works out to a 6:15 hour rolling day and roughly 2.3 pickups per minute.
Sounds like they are flying to me but we have never used analytics to look at the actual routes and their efficiencies.
I suspected that you were doing garbage truck routing. The VRP solver is currently not designed to do such high-density routing. There are other ways to solve this kind of problem, where you need to visit each street (as in postal delivery and snow plow routes), and the solution could be an arc routing approach, sometimes also called Chinese Postman solvers.
You can see some previous discussions here:
Let me add one suggestion. You have 7000 locations for 8 trucks. Which works out to be 875 per truck. Can all the trash be picked up by one truck without the need to empty it? That is, can you further subdivide the work based on capacity?
The trucks have a 20,000 lb. capacity and each stop generates on average 15 lbs. of effluent (actually it's organics/compostables). 875 x 15 = 13,125 lbs. No problem.
I understand the general concepts to use other solvers/summary geographies to solve the capacity issue. Each introduces its own data maintenance and/or bias issues.
The 3rd party solution is the most likely path but RouteSmart doesn't like to work with locally maintained data so that is an issue.
We'll keep digging.
Thx for the time.
Are your "orders" every household? Can you look at aggregating the orders per street or per side of street? That is, place one order per side of street and sum up the amount to pick up and service time to that order? You can run Add Locations on the orders and then Summarize the values based on the SourceOID field. And you can generate "mid points" to attach these values to with the Feature to Point tool (with the Inside option).
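As a sketch of that aggregation idea (in plain Python with hypothetical field names, not the actual Add Locations/Summarize workflow), one aggregate order per side of street would carry the summed load and service time of its households:

```python
from collections import defaultdict

# Hypothetical household-level orders: (street, side, weight_lbs, service_sec).
# Field names and values are illustrative only.
orders = [
    ("Elm St", "L", 15, 15),
    ("Elm St", "L", 18, 15),
    ("Elm St", "R", 12, 15),
]

# Collapse to one aggregate order per (street, side), summing the
# pickup amount and service time, analogous to summarizing on the
# street each household snapped to.
agg = defaultdict(lambda: [0, 0])
for street, side, lbs, sec in orders:
    agg[(street, side)][0] += lbs
    agg[(street, side)][1] += sec

for (street, side), (lbs, sec) in sorted(agg.items()):
    print(f"{street} ({side}): {lbs} lbs, {sec} sec service")
```

Collapsing 7000 household orders down to one order per street side could bring the order count back into the range where the solver performs well, at the cost of losing per-household sequencing detail.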