Inserting large numbers of records using ITable.insert cursors causes high memory use

Discussion created by p_d_austin on Aug 8, 2011
I am developing a Java application that inserts 1.3 million rows into a file geodatabase using the ArcObjects Java API.

The problem is that memory usage grows with every row inserted, and I can't figure out how to prevent this. Here are the steps I have taken and the observations I have made:

  • All interaction with ArcObjects classes is done in a single thread

  • A single IRowBuffer (or IFeatureBuffer) is re-used for all inserts

  • A single IGeometry (Polyline) object is used for all inserts; its points are set for each input geometry from _WKSPoint or _WKSPointZ instances

  • A single GeometryEnvironment instance is used to set the points on the geometry

  • Load-only mode is set on the feature class (I have tried with and without it)

  • I have tried both true and false for the buffering flag on the ITable.insert method, but it doesn't seem to make a difference

  • I flush every 1,000 inserts (which takes about 1 second)

  • I tried closing the cursor every 1,000 inserts, releasing it, and creating a new one, but that didn't change the memory behaviour

  • The Java heap stays small throughout the run, so the growth must be on the native (COM) side rather than in Java-managed memory

  • Although my actual code is more complicated, it basically follows the same approach as the cursor documentation: http://help.arcgis.com/en/sdk/10.0/java_ao_adf/conceptualhelp/engine/index.html#/How_to_use_cursors_in_the_geodatabase/0001000003s5000000/
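To make the structure concrete, the insert loop I'm describing is essentially the following. This is a minimal stand-in sketch, not my real code: the ArcObjects types (ITable, ICursor, IRowBuffer) are replaced with plain Java so only the reuse-one-buffer, flush-every-N pattern is shown, and the `BatchedInserter` class and its `Consumer` flush hook are hypothetical names of my own.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

/**
 * Stand-in for the insert-cursor pattern described above: one reusable
 * buffer, rows pushed through a cursor, and a flush every N inserts.
 * In the real code the flush hook would correspond to ICursor.flush().
 */
class BatchedInserter {
    private final int flushInterval;
    private final Consumer<List<String>> flushAction; // stands in for cursor.flush()
    private final List<String> pending = new ArrayList<>();
    private long inserted = 0;

    BatchedInserter(int flushInterval, Consumer<List<String>> flushAction) {
        this.flushInterval = flushInterval;
        this.flushAction = flushAction;
    }

    // Stands in for cursor.insertRow(rowBuffer): buffer one row,
    // flushing automatically every flushInterval inserts.
    void insert(String row) {
        pending.add(row);
        inserted++;
        if (inserted % flushInterval == 0) {
            flush();
        }
    }

    // Push any buffered rows out and clear the pending batch.
    void flush() {
        if (!pending.isEmpty()) {
            flushAction.accept(new ArrayList<>(pending));
            pending.clear();
        }
    }

    long insertedCount() {
        return inserted;
    }
}
```

In the real run, each iteration also reuses the single IRowBuffer and Polyline rather than allocating new ones, which is why I would expect memory to stay flat; the growth I see happens despite this structure.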