10.2 Crash When Geocoding

08-22-2013 09:52 AM
PatrickDavis
New Contributor
Have others experienced crashing when geocoding a large number of records in 10.2?  I work for a large school district and have a process set up to geocode our student database (~85k records) on a nightly basis, but since updating to 10.2 ArcMap crashes every time.  I had similar issues in 10.1 with a composite locator, but was able to geocode with a single dual range locator.

This is very frustrating, and I'm curious whether others have experienced this or anyone has a workaround.

Thanks,
Patrick Davis
5 Replies
KimOllivier
Occasional Contributor III
There is a service pack out for geocoding from a server for 10.2, but it does not sound like the same issue.
It crashes for me, but since I am using a custom locator I expect it to crash.

Maybe increase the cache size in the locator properties?
You can also apply filters to remove empty records so that they are not even attempted.
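To illustrate the filtering idea, here is a minimal sketch that drops rows with a blank address field from a delimited address file before it ever reaches the locator. The file layout and the `ADDRESS` field name are hypothetical; adjust them to match your own export.

```python
import csv

def filter_empty_addresses(in_path, out_path, addr_field="ADDRESS"):
    """Copy rows from in_path to out_path, skipping any row whose
    address field is blank, so the locator never attempts them."""
    with open(in_path, newline="") as src, \
         open(out_path, "w", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            # Keep only rows where the address field has real content.
            if row.get(addr_field, "").strip():
                writer.writerow(row)
```

Running this on the nightly extract first means the geocoder only sees records that can actually match.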
PatrickDavis
New Contributor
Thanks, Kim.

You're correct, the patch is for a different issue, but I tried it anyway, and still no luck.

I was able to geocode a much smaller file of about 4k records.  The file I need to geocode has 85k records and is about 10 MB.  I have removed as many fields as I could, and I changed the Data Cache to 2048 MB, which is the maximum.

Any other thoughts?
PatrickDavis
New Contributor
I don't know if others have come across this issue, but I may have found a workaround.  I was originally using a .txt address file to geocode, and it would crash.  I was able to get the file geocoded properly after I imported the .txt file into Excel and saved it as an .xls file.

No idea why this worked, but it did for some reason.

Patrick
PatriciaApt
New Contributor
I'm still working in 10.0 and I've always had this problem. I too regularly geocode 90,000 students from Excel files. It turns out the geocoding process reads each field in the Excel file as 250 characters wide. This was creating huge files that most often would crash partway through.
The solution that worked for me was to convert the excel file to an info table, defining each field with appropriate field widths.
Then geocode the info table.

More specifically:
1. Open ArcToolbox.
2. Select Conversion Tools.
3. Select To Geodatabase.
4. Select Table to Table.
When the wizard opens, under Field Map, right-click each field and define its properties. With large files this process can take quite a while.

Hope this helps.
Patricia
KimOllivier
Occasional Contributor III
Loading Excel files into a database is still a major problem for any geoprocessing because there is no schema in Excel. The loader guesses the types (string or floating point) from the first 3 lines, setting string fields to a ridiculous 255 characters wide. No integer fields for keys, either.

The new Excel loader geoprocessing tool at 10.2 is just as bad as the interactive method because you cannot specify a schema. What were they thinking?

One solution is to export the spreadsheet to a CSV file and use the Microsoft schema.ini to define a schema. See the help for the format and parameters. This works really well - you can rename the fields to valid names, define integers and smaller string fields and even dates are possible. It is also very fast.

If you have a table in a geodatabase with the same schema, you can generate the schema.ini by exporting a few lines as a CSV file and then editing the section header to match the name of the input file.

An alternative way to load Excel files with a proper schema is to use FME or the Data Interoperability Extension as Esri calls it.