
memory error with list

11-18-2014 12:31 PM
LindseyWood
New Contributor III

I am taking lats and longs from a .csv and creating polygons. I originally used them to make .shp files, until I found out about the 2 GB limit.

So I thought I would use a .gdb for the feature class, but I think I am running into another issue: the featureList I am creating hits a memory error and stops at around 2 million features, I believe.

What would be the best approach to create one large feature class in a .gdb from a .csv that may have 3 million or more rows?

I thought maybe I could reset the featureList, but then I lose the index if I append to the attribute table.

Ultimately I may have a need to join the original .csv to the feature class through a common index.

Any ideas appreciated. I realize I may also be able to solve this by using 10.2 with 64-bit Python; however, if I need to move to production I cannot guarantee all users will have this, and I read you would need to make sure all libraries are running 64-bit as well.

So for now, assume ArcGIS 10.0, 32-bit.

PS: sorry about the code formatting; I do not see a button to format code, as I have not been on this new forum before.

Thanks!!

featureList = []

# Corner coordinates for one footprint, read from the .csv row.
# The first point is repeated at the end to close the ring.
coords = [(ullon_val, ullat_val), (urlon_val, urlat_val),
          (lrlon_val, lrlat_val), (lllon_val, lllat_val),
          (ullon_val, ullat_val)]

array = arcpy.Array()
for x, y in coords:
    array.add(arcpy.Point(x, y))

# Create a Polygon object based on the array of points
polygon = arcpy.Polygon(array)

# Clear the array for future use
array.removeAll()

# Append to the list of Polygon objects
featureList.append(polygon)
print sys.getsizeof(featureList)  # 2097152 bytes = 2 MB

try:
    arcpy.CopyFeatures_management(featureList,
                                  "D:/Footprints/process/Fgdb.gdb/test_3")
except:
    # If an error occurred while running a tool, print the messages
    print arcpy.GetMessages()
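For what it's worth, the coordinate parsing itself can stream straight from the .csv, so only one row's corners ever sit in memory at a time. A minimal stdlib sketch (the column names `ullon`, `ullat`, etc. are assumptions; adjust to match the real header):

```python
import csv

def iter_footprints(csv_path):
    """Yield one closed ring of (x, y) corner tuples per CSV row.

    Column names (ullon, ullat, ...) are assumptions -- rename to
    match the actual .csv header. The first corner is repeated at
    the end to close the ring.
    """
    with open(csv_path) as f:
        for row in csv.DictReader(f):
            ul = (float(row["ullon"]), float(row["ullat"]))
            ur = (float(row["urlon"]), float(row["urlat"]))
            lr = (float(row["lrlon"]), float(row["lrlat"]))
            ll = (float(row["lllon"]), float(row["lllat"]))
            yield [ul, ur, lr, ll, ul]
```

Each yielded ring could then be turned into a polygon and written out immediately, instead of accumulating millions of them in featureList first.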

Message was edited by: Curtis Price - see Posting Code blocks in the new GeoNet

3 Replies
XanderBakker
Esri Esteemed Contributor

If you run into memory problems, you can use the arcpy.da.InsertCursor (or arcpy.InsertCursor for 10.0) to insert each feature...

curtvprice
MVP Esteemed Contributor

I'm with Xander, glomming this all into a huge object in memory is probably not a good idea even if you do run 64-bit and have the addressing space.

You're much better off writing the features to disk one by one, or at least buffering (doing a bunch and then inserting a bunch at a time, clearing memory for the next batch) instead of trying to stuff zillions of features into memory....

Python normally does this kind of buffering for efficiency when working with much much smaller text files....
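One way to sketch that buffering idea in pure Python (the batch size and what you do with each batch, e.g. insert the rows via a cursor or CopyFeatures/Append, are up to you):

```python
from itertools import islice

def in_batches(iterable, size):
    """Yield lists of up to `size` items from `iterable`.

    Lets you build a modest featureList, write it to the .gdb,
    and drop it before reading the next chunk, keeping memory
    use flat no matter how many rows the .csv has.
    """
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch
```

Each yielded batch would be written out (for example with an insert cursor as Xander suggests) and then discarded before the next one is built.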

JamesCrandall
MVP Frequent Contributor

Do you have to rebuild the entire layer each time?

Why not keep a "current" version and just update that with new incoming information?  It's pretty straightforward to compare two .csv's, extract the differences and use these to update the "current" layer.
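A minimal sketch of that compare step, keyed on a shared index column (the column name `idx` is an assumption, it would be whatever common index the two .csv's share):

```python
import csv

def changed_rows(old_path, new_path, key="idx"):
    """Return rows from new_path that are new or changed vs. old_path.

    Rows are matched on the shared `key` column; only the differences
    need to be pushed into the "current" layer.
    """
    with open(old_path) as f:
        old = {r[key]: r for r in csv.DictReader(f)}
    updates = []
    with open(new_path) as f:
        for row in csv.DictReader(f):
            if old.get(row[key]) != row:
                updates.append(row)
    return updates
```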