
What is the fastest way to iterate through Street data to fix Address Range issues?

08-25-2017 06:55 AM
ColinLang1
New Contributor III

I have 5 types of issues with address ranges, due to irregularities in how the local communities assigned civic addresses. These include odds and evens on the same side of the street, the "high" end of the range actually being lower than the "low" end (due to digitizing direction relative to traffic flow direction), or all of the addresses on both sides being odd or all being even, as in the case of a cul-de-sac. For some of these cases, I need to find the connected segments of road that share the same street name, zero their ranges, and put the entire range onto one segment to solve the issue.
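For anyone following along, the checks described above might look something like this in plain Python. This is only a sketch; the field names (`L_FROM`, `L_TO`, `R_FROM`, `R_TO`, with 0 meaning "no addresses on that side") are hypothetical stand-ins for whatever the actual street schema uses:

```python
def parity(n):
    """Return 0 for even, 1 for odd, or None when the value is 0 (unaddressed)."""
    return None if n == 0 else n % 2

def find_issues(seg):
    """Return a list of problem labels for one street segment.

    `seg` is a plain dict with hypothetical keys L_FROM, L_TO (left-side
    range) and R_FROM, R_TO (right-side range).
    """
    issues = []
    for side in ("L", "R"):
        lo, hi = seg[side + "_FROM"], seg[side + "_TO"]
        p_lo, p_hi = parity(lo), parity(hi)
        # Odds and evens mixed on the same side of the street.
        if p_lo is not None and p_hi is not None and p_lo != p_hi:
            issues.append(side + ": mixed parity")
        # "High" end of the range lower than the "low" end.
        if lo and hi and hi < lo:
            issues.append(side + ": reversed range")
    # Cul-de-sac case: every address on both sides is odd (or even).
    parities = {p for p in (parity(seg[k])
                for k in ("L_FROM", "L_TO", "R_FROM", "R_TO")) if p is not None}
    if len(parities) == 1 and seg["L_FROM"] and seg["R_FROM"]:
        issues.append("both sides same parity")
    return issues
```

A clean segment (evens left, odds right, low to high) returns an empty list; the 110 or so problem roads would each return one or more labels.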


I need to export my street data for use by others, and their software doesn't tolerate any of these oddities, so we have to massage the data to make it work by adjusting the ranges so that they comply with the rules: "odds on one side, evens on the other" and "low to high" in the digitizing direction.

We have a tool written by others in .NET/ArcObjects which can process the entire dataset (>60,000 roads) in about a minute or two (only about 110 roads have issues). But the existing tool has deficiencies that we want to correct, and we don't have access to the original source code. The fastest I've been able to manage is about 30 minutes to loop through the data in Python using selection sets and cursors.

I'm trying to figure out how the author of the other tool did it so much faster.  Any ideas?  I'm wondering if I would have better luck loading all of the attribute data into python dictionaries and processing it in memory and writing all the changes back at the end, rather than using a cursor to loop through the data directly.
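The "load everything into a dictionary" idea described above can be sketched as a single read pass, all fixes in memory, then a single write pass. The arcpy.da I/O is shown only in comments (with hypothetical field names), since the in-memory fix itself needs no arcpy. This is one guess at the pattern the faster tool uses, not a description of its actual code:

```python
# Read once, fix in memory, write once. With arcpy it might look like:
#
#   fields = ["OID@", "STREETNAME", "L_FROM", "L_TO", "R_FROM", "R_TO"]
#   rows = {r[0]: list(r[1:]) for r in arcpy.da.SearchCursor(fc, fields)}
#   changed = fix_reversed_ranges(rows)
#   with arcpy.da.UpdateCursor(fc, fields) as cur:
#       for r in cur:
#           if r[0] in changed:
#               cur.updateRow([r[0]] + changed[r[0]])
#
# The fix-up pass, here handling only the "high end lower than low end" case:

def fix_reversed_ranges(rows):
    """rows: {oid: [name, l_from, l_to, r_from, r_to]}.

    Returns {oid: fixed_values} containing only the rows that changed,
    so the write pass touches as few features as possible.
    """
    changed = {}
    for oid, (name, lf, lt, rf, rt) in rows.items():
        new = [name, lf, lt, rf, rt]
        if lf and lt and lt < lf:
            new[1], new[2] = lt, lf   # swap left-side from/to
        if rf and rt and rt < rf:
            new[3], new[4] = rt, rf   # swap right-side from/to
        if new != [name, lf, lt, rf, rt]:
            changed[oid] = new
    return changed
```

The key performance point is that the update cursor only writes the ~110 changed rows instead of editing while iterating, and no per-row selections are made.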

I don't want to influence replies by posting my code - I want to abandon my method and find some alternate method that is significantly faster.  So can anyone think of a way to do it in only 2 minutes?

11 Replies
ColinLang1
New Contributor III

1. I am using arcpy.da cursors

2. My first step exports the street file from the geodatabase on the server to a local copy.  Right now that is a shapefile because that is what is required for the final output, but I could have an intermediary step of a file geodatabase if that is faster overall

3. As per #2, it is a local copy

4. I import arcpy, os, datetime, and OrderedDict from collections. I am using all of these.

5.  I still have further steps to develop to complete this tool, and the overall process could end up being much longer.  I also have co-workers who use the tools I develop, who complain when my tools aren't as fast as those from other sources, so I want to learn methods that I can use over and over again for faster processing.

MicahBabinski
Regular Contributor

The in_memory workspace may give you a performance boost over a local file geodatabase or shapefile because there's no disk I/O. You also might want to try timing the different steps of your process to see where the bottleneck is.
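Timing each step is easy with a small helper around `time.perf_counter`. The stand-in workloads below would be replaced with the real export, cursor, and write steps:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label, log):
    """Record how long a with-block takes, so the slow step stands out."""
    start = time.perf_counter()
    yield
    log[label] = time.perf_counter() - start

timings = {}
with timed("load", timings):
    data = list(range(1_000_000))   # stand-in for the cursor read
with timed("process", timings):
    total = sum(data)               # stand-in for the fix-up pass

# Print slowest step first.
for step, seconds in sorted(timings.items(), key=lambda kv: -kv[1]):
    print(f"{step}: {seconds:.3f} s")
```

If one step dominates (cursor iteration usually does), that's where the dictionary approach or in_memory workspace will pay off.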
