
Better method than APPEND?

03-18-2014 08:56 AM
JimCousins
MVP Regular Contributor
I am processing about 5,000 small tables (INFO tables). They are the result of an intersect and summary statistics run for each element of a polygon grid.
I now want to get all of the tabular information into a single table. I am using Python with an iterator and Append_management, but it is slow (about 1 hour per 1,000 tables). Is there a better way to accomplish this? All tables have the same schema, only a different number of rows (1 to 12).
Regards,
Jim
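[Editor's note, not from the original thread: the per-call overhead of invoking Append_management 5,000 separate times is usually what dominates here, not the row counts. The common alternative is one loop that reads each source table and writes every row through a single insert operation on the output. A minimal sketch of that pattern, using Python's built-in sqlite3 as a stand-in for the geodatabase; the table count is scaled down and all table/column names are hypothetical:]

```python
import sqlite3

# In-memory database standing in for the workspace.
con = sqlite3.connect(":memory:")
cur = con.cursor()

# Simulate many small source tables sharing one schema (1 to 12 rows each),
# scaled down from 5,000 for the demo.
n_tables = 500
for i in range(n_tables):
    cur.execute(f"CREATE TABLE src_{i} (grid_id INTEGER, area REAL)")
    rows = [(i, float(j)) for j in range(1 + i % 12)]
    cur.executemany(f"INSERT INTO src_{i} VALUES (?, ?)", rows)

# One merged output table: a single loop writing into one target replaces
# hundreds of separate Append calls, each of which pays setup overhead.
cur.execute("CREATE TABLE merged (grid_id INTEGER, area REAL)")
for i in range(n_tables):
    cur.execute(f"INSERT INTO merged SELECT * FROM src_{i}")
con.commit()

total = cur.execute("SELECT COUNT(*) FROM merged").fetchone()[0]
print(total)  # total rows across all 500 source tables
```

In the ArcGIS case the analogous approach would be a search cursor per source table feeding one insert cursor on the target, so the output table is opened once rather than once per append.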
5 Replies
RobertBorchert
Honored Contributor
I think you're stuck with what you have. Actually, an hour to append 1,000 tables does not sound bad. Processing speed depends on the size of the tables and the number of tables involved in the append; in this case, 5,000 tables is certainly a slowdown factor, even if they are small.

Data Loader is faster because you're not in an edit session. However, setting it up would be cumbersome.

RichardTruong
Occasional Contributor

Hi Robert,

I tried Data Loader for my table in ArcGIS Server, and I received the error "Cannot load simple table". Any suggestions are greatly appreciated.

Richard

JimCousins
MVP Regular Contributor
Thanks, Robert. It is helpful to know I have not overlooked the obvious.... Again.
Best Regards,
Jim
MarkBoucher
Honored Contributor
Editing over a network is slow. If you are working over a network, copy the data to your local drive, edit it faster, then copy it back. Just a thought.
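[Editor's note, not from the original thread: Mark's copy-local/copy-back workflow is easy to fold into the same script. A hedged sketch using only the standard library; the temp directories and file names below are placeholders standing in for the network workspace and a local scratch folder:]

```python
import shutil, tempfile
from pathlib import Path

# Hypothetical "network" workspace, simulated with a temp directory.
network = Path(tempfile.mkdtemp(prefix="network_"))
(network / "grid.dbf").write_text("original rows")

# 1. Copy the workspace down to a local scratch folder.
local = Path(tempfile.mkdtemp(prefix="scratch_")) / "workspace"
shutil.copytree(network, local)

# 2. Do the heavy editing against the fast local copy.
(local / "grid.dbf").write_text("original rows + appended rows")

# 3. Copy the finished result back over the network copy (Python 3.8+).
shutil.copytree(local, network, dirs_exist_ok=True)

print((network / "grid.dbf").read_text())
```

The I/O saved per table is small, but multiplied across 5,000 append operations it can easily outweigh the cost of the two bulk copies.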
RobertBorchert
Honored Contributor

Since writing my last reply, I saw Mark Boucher's answer, and it reminded me of something I do.

I created a simple batch file that automatically backs up and compresses the databases I work with on my computer. I click the icon first thing every morning, and it starts the programs I want (email, browser, ArcGIS), compresses my active databases, and backs them up to the network.

Editing on a personal computer, as Mark wrote, is significantly faster in our environment than working off a hard drive on a server somewhere.
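[Editor's note, not from the original thread: the compress-and-back-up step Robert describes can also be scripted in Python rather than a batch file. A minimal sketch using the standard library; the folder names are placeholders (temp directories stand in for the local database folder and the network share):]

```python
import shutil, tempfile
from pathlib import Path

# Hypothetical local database folder with some content to back up.
db_dir = Path(tempfile.mkdtemp(prefix="active_db_"))
(db_dir / "parcels.gdbtable").write_text("table data")

# Hypothetical network share to receive the compressed backup.
share = Path(tempfile.mkdtemp(prefix="share_"))

# Zip the whole database folder onto the "share" in one call.
archive = shutil.make_archive(str(share / "db_backup"), "zip", root_dir=db_dir)
print(Path(archive).name)
```

In a real setup this could run from a scheduled task each morning, with a dated archive name to keep multiple backups.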
