There was an error trying to process this table.

07-18-2013 10:31 AM
JustinTomei
New Contributor II
I have a dataset of about 32,000 addresses I have been trying to geocode. Time after time I keep getting the same error message, the one in the title of this thread. I have changed the names of columns, tried making it a DBF, tried removing everything but the address information, and tried splitting the data in half to make it smaller. Regardless of what I do, I get anywhere from 1% to 5% done and then I get the error.

Any ideas?

Thanks.
44 Replies
JoeBorgione
MVP Emeritus
As long as you've made all the field names one word with no special characters, you should be good there. (Typically it'll bail out right at the get-go if that's the problem.)

My guess is that somewhere you've got a <null> value in one of your address records.  Personally, I'd make sure the address table is within some flavor of geodatabase; that way all your data and results can be in the same place.
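
Not from Joe's post, but a quick way to act on it: a minimal arcpy sketch (ArcGIS 10.1+ for the arcpy.da cursor; the table path and field names are hypothetical) that flags rows with a null or blank address component before you geocode:

import arcpy

# Hypothetical table and field names -- substitute your own.
table = r"C:\data\geocoding.gdb\addresses"
fields = ["Address", "City", "State", "ZIP"]

# Print the rows with a null or blank address component; a single bad
# row can be enough to kill the whole geocoding run.
with arcpy.da.SearchCursor(table, ["OID@"] + fields) as rows:
    for row in rows:
        if any(v is None or str(v).strip() == "" for v in row[1:]):
            print("Row {0} is suspect: {1}".format(row[0], row[1:]))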
That should just about do it....
AnthonyMosinski
New Contributor
I am having this problem as well. I have a data set of over 250,000 addresses. I have tried to run the entire table, but that gets nowhere, so I am breaking the data out into sets of 10,000 at a time, seeing how many run, deleting those from the Excel file, and then running what is left until I have geocoded all 10,000. I continue to get the error, and I would love to know if there is a solution to it, as well as whether there is a way I can run a bigger batch, or if it is easier to just keep breaking them up.

I have deleted all unnecessary columns in the Excel file; I only have street address (number and street name), city, state, ZIP, and a unique ID so I can match everything back up after I merge all the shapefiles.

Any help with this would be great. Thank you.
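
Not part of the original post, but if you do end up splitting the data, a short Python sketch can automate the chunking instead of hand-deleting rows in Excel. It assumes the sheet has been exported to a CSV with a header row; the paths are hypothetical:

import csv

src = r"C:\data\addresses.csv"   # hypothetical input, exported from Excel
chunk_size = 10000

def write_chunk(part, header, rows):
    out = r"C:\data\addresses_part{0}.csv".format(part)
    with open(out, "w", newline="") as g:
        writer = csv.writer(g)
        writer.writerow(header)
        writer.writerows(rows)

with open(src, "r", newline="") as f:
    reader = csv.reader(f)
    header = next(reader)
    part, rows = 1, []
    for row in reader:
        rows.append(row)
        if len(rows) == chunk_size:
            write_chunk(part, header, rows)
            part, rows = part + 1, []
    if rows:                      # leftover partial chunk
        write_chunk(part, header, rows)

Each addresses_partN.csv keeps the header, so it can be geocoded on its own and the results merged back on the unique ID.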
TOMKONG
Occasional Contributor II
After Esri upgraded the ArcGIS data model from coverage to shapefile to geodatabase, the geoprocessing tools could no longer deal with large dataset inputs.
But the old coverage geoprocessing tools can handle large inputs, so you need to:
1. Install ArcInfo Workstation on your PC first (then load the coverage tools in, or use the command line to run them).
2. Convert all of your input data to coverage format.
3. Run the coverage geoprocessing tool.
(It often happens that when you run a shapefile or geodatabase geoprocessing tool, it gives a message about a lack of memory. Actually, it is not a memory issue.)

When I deal with large dataset inputs, I always switch to running the coverage geoprocessing model under ArcInfo Workstation.
JoeBorgione
MVP Emeritus
Are you suggesting converting an Excel spreadsheet into a coverage? If you are, I'm a little confused by that.

I routinely geocode hundreds of thousands of addresses without any bail-outs, seizures, or other malfunctions.  I'm sticking with my story: Excel is a great spreadsheet.  I use it all the time to bill my clients and pay my taxes.  But when it comes to actually making my living, I use databases.  Back in the day, INFO ruled as a database; that was then, this is now.
That should just about do it....
PeterHanmore
New Contributor III
We got this error frequently when building our custom locators. For us, it was actually triggered by the ArcGIS Desktop geocoding tool timing out exactly one minute after the batch of addresses was submitted.  If the ArcGIS geocoding service cannot process all the addresses in the batch (and return a response) within one minute, the ArcCatalog/ArcMap geocoding tools will time out.
You can mitigate the issue by decreasing the suggested batch size (at the cost of some performance) so that fewer addresses need to be geocoded within that minute.  We also found that, for some reason, multiline geocoding (addresses broken into street + city fields) is significantly slower than single-line (unparsed) geocoding.  I have not found any way to change ArcCatalog's behaviour so that it will wait longer than one minute before timing out; if anyone else knows, I'd love to hear how you do it.
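
Not from Peter's post: for a geocode service you administer, the suggested batch size lives in the service's properties, so one hedged way to lower it is through the ArcGIS Server Admin API. The server URL, service name, credentials, and especially the property key below are assumptions; inspect the JSON your server actually returns before editing anything.

import json
import urllib.parse
import urllib.request

# Hypothetical server, service, and credentials.
admin = "https://myserver:6443/arcgis/admin"
svc = admin + "/services/MyLocator.GeocodeServer"

def post(url, params):
    data = urllib.parse.urlencode(params).encode("utf-8")
    return json.loads(urllib.request.urlopen(url, data).read())

# 1. Get an admin token.
token = post(admin + "/generateToken",
             {"username": "admin", "password": "secret",
              "client": "requestip", "f": "json"})["token"]

# 2. Fetch the current service definition.
svc_json = post(svc, {"token": token, "f": "json"})

# 3. Lower the batch size and push the edit back. The key name is an
#    assumption and varies by version -- confirm it in svc_json first.
props = svc_json["properties"]
key = "suggestedBatchSize"
if key in props:
    props[key] = "100"
    post(svc + "/edit", {"service": json.dumps(svc_json),
                         "f": "json", "token": token})
else:
    print(sorted(props))   # find the right key, then rerun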
KimOllivier
Occasional Contributor III
People seem to be having some serious problems geocoding a small file of one million records!
You should never have to split a table for geocoding, since it is a serial process, unless you are trying to use a server geocoder for batch work. In that case the app posts all the records across the network and back again, which is not likely to be fast enough to beat a timeout.

As a benchmark, I expect to be able to geocode addresses at a rate of at least 1M/hour, and sometimes up to 6M/hour, on my desktop machine. My typical file contains half a million addresses. I have the reverse performance experience with single-line versus batch processing: single-line slows down to one tenth the speed because zone indexes cannot kick in.

Therefore I hope you can keep looking for the non-obvious error that is causing the problem for you.
My laptop has no trouble geocoding any size file, so it is not the size of the machine.

Indexes are almost certainly essential for normal use. Do you have parts of your address that can be indexed to allow the geocoder to break up the search, such as state, ZIP, or city?

Any geoprocessing across a network drive is a mistake, and this includes geocoding. Networks are just not fast enough, with all the packing and unpacking, to send each request in a reasonable time. Put your geocoder, source, and output on the same local machine on a local drive. I understand this will break "corporate policies" nearly everywhere, but do a demo to show the problem to your administrator. You will have to copy, process, then replace. This also applies if you are using a remote database via SDE.

I personally copy any source, whether spreadsheet, shapefile, or text file, to a local file geodatabase table (see the sketch at the end of this post). That enables indexing and unlimited sizes, and makes it easy to put the results back in an enterprise database. You get complete control over the field types, null values, and other unexpected data that will trip up the geocoder. Null values are sure to cause indexing to fail.

It is very hard to set up a custom locator properly; I have yet to make one work properly myself after trying to tweak the defaults. The default timeout in my locator is 100 seconds, but who wants to wait that long for each failure? If it takes even a millisecond it is too long, so I set it to 1 second. If timing out is a problem, then indexing is not kicking in.

If indexing is not running, why not? Either the reference data does not have suitable indexes built, or you are throwing address candidates at it that do not have indexable components, e.g. no state when the locator expects a state.
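
The staging step Kim describes might look something like this arcpy sketch (the paths and zone field names are hypothetical; adapt them to your own data):

import arcpy

# Hypothetical paths; the point is staging on a local drive.
src = r"\\corp-share\data\addresses.csv"
gdb = r"C:\local\geocode.gdb"

if not arcpy.Exists(gdb):
    arcpy.CreateFileGDB_management(r"C:\local", "geocode.gdb")

# Copy the source to a local file geodatabase table: full control over
# field types, no size limits, and the table can be indexed.
arcpy.TableToTable_conversion(src, gdb, "addresses")

# Attribute indexes on the zone fields (names assumed) let the geocoder
# narrow the search by city/state/ZIP instead of scanning everything.
table = gdb + r"\addresses"
for field in ("City", "State", "ZIP"):
    arcpy.AddIndex_management(table, field, "idx_" + field.lower())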
KevinPark
New Contributor
I am still having issues. I am trying to geocode five years' worth of data, with roughly 20K records in each year, working in ArcGIS Desktop 10 Service Pack 5 (ArcInfo license).

I have tried the 10.0 North America Geocode Service and 10.0 US Streets Geocode Service.

When I try to geocode from the CSV file, I get the message "There was an error trying to process this table" with no further information.
When I try to geocode after converting to a geodatabase table, I get a variety of different error messages, including
"Attempted to read or write protected memory. This is often an indication that other memory is corrupt" or
"No resource could be found at that address"

But I don't understand, because I got two years of data to work, and there is nothing different about the remaining three.
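
Not from Kevin's post, but one way to hunt for whatever is invisibly different about the failing years: scan each CSV for ragged rows, blank fields, and stray byte values, which are the usual hidden culprits. A Python 3 sketch with a hypothetical path:

import csv

path = r"C:\data\year3.csv"   # hypothetical: one of the failing files

# Look for byte values outside printable ASCII (tab/CR/LF excepted);
# smart quotes and other pasted characters often hide here.
with open(path, "rb") as f:
    data = f.read()
odd = sorted(set(b for b in data if b > 126 or (b < 32 and b not in (9, 10, 13))))
print("suspicious byte values:", odd)

# Look for rows with the wrong field count or blank fields.
with open(path, "r", newline="", errors="replace") as f:
    reader = csv.reader(f)
    header = next(reader)
    for i, row in enumerate(reader, start=2):
        if len(row) != len(header):
            print("line {0}: {1} fields, expected {2}".format(i, len(row), len(header)))
        elif any(not v.strip() for v in row):
            print("line {0}: blank field".format(i))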
JoeBorgione
MVP Emeritus
Any chance you can provide one of the CSVs that bail out?
That should just about do it....
KevinPark
New Contributor
Afraid not, because of confidentiality issues.

I did get another error message from the geodatabase table, though: something about bad syntax, even though I was using the dialog box.