Insert Fails at 1000 records

06-11-2010 10:34 AM
MichaelFischer
Emerging Contributor
I'm trying to insert a number of points into an ArcSDE 9.3.1 geodatabase inside an edit session.

I've found that at 1,000 features I get an exception: HRESULT 0x8005018B.

This only happens on SDE; a personal geodatabase is fine.

I've tried flushing the load cursor every 500 records, but it made no difference.

Any ideas?

Thanks
5 Replies
AlexanderGray
Honored Contributor
Right off the bat it looks like a problem with SDE or the RDBMS. It sounds like a buffer or temp table is filling up, or one that has a maximum set on it. I'm not really an SDE or Oracle guru, but I would look in that direction.
KirkKuykendall
Deactivated User
Are you doing your inserts between IEditor.StartOperation and IEditor.StopOperation?
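Something along these lines (a rough C# sketch, not tested; editor, featureClass, and points are placeholders for your own objects):

using System.Collections.Generic;
using ESRI.ArcGIS.Editor;
using ESRI.ArcGIS.Geodatabase;
using ESRI.ArcGIS.Geometry;

// Wrap the whole load in one edit operation so the inserts are
// batched inside the edit session's transaction.
public void LoadPoints(IEditor editor, IFeatureClass featureClass,
                       IEnumerable<IPoint> points)
{
    editor.StartOperation();
    try
    {
        IFeatureCursor cursor = featureClass.Insert(true);   // buffered insert cursor
        IFeatureBuffer buffer = featureClass.CreateFeatureBuffer();
        foreach (IPoint point in points)
        {
            buffer.Shape = point;
            cursor.InsertFeature(buffer);
        }
        cursor.Flush();                                      // write buffered rows
        editor.StopOperation("Load points");
    }
    catch
    {
        editor.AbortOperation();                             // roll back on failure
        throw;
    }
}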
JamesGonsoski
Occasional Contributor
It may or may not have anything to do with your issue, but check this help page on ArcSDE Initialization Parameters. Note the default value of the AUTOCOMMIT parameter.
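If you need to change it, the sdeconfig admin command should be able to alter it; something like the line below (syntax from memory, so verify it against the ArcSDE command reference; the value, service name, and credentials are placeholders):

sdeconfig -o alter -v "AUTOCOMMIT=10000" -i esri_sde -u sde -p <sde_password>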
JohnHauck
Frequent Contributor
What type of underlying database? Is SDE also at 9.3.1? Can you post some example code? I tried a quick test with SQL Express and didn't see a problem with inserting 100,000 features.
RemigijusPankevicius
Emerging Contributor
It must be that Oracle AUTOCOMMIT setting, which is 1000 by default.
http://proceedings.esri.com/library/userconf/proc01/professional/papers/pap869/p869.htm

I've seen examples of how to do such bulk inserts without any DB tuning, at the cost of giving up the edit transaction.
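Roughly like this (a C# sketch; featureClass and points are placeholders, and since there is no edit session, each flushed batch stays in the database even if a later insert fails):

using System.Collections.Generic;
using ESRI.ArcGIS.Geodatabase;
using ESRI.ArcGIS.Geometry;

// Buffered inserts with periodic flushes, outside any edit session.
public void BulkLoad(IFeatureClass featureClass, IEnumerable<IPoint> points)
{
    IFeatureCursor cursor = featureClass.Insert(true);  // buffered insert cursor
    IFeatureBuffer buffer = featureClass.CreateFeatureBuffer();
    int count = 0;
    foreach (IPoint point in points)
    {
        buffer.Shape = point;
        cursor.InsertFeature(buffer);
        if (++count % 500 == 0)
            cursor.Flush();  // commit this batch to the database
    }
    cursor.Flush();          // commit the remainder
}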