
Out of memory error

Question asked by hs1991 on Jan 20, 2016
Latest reply on Jan 21, 2016 by Dan_Patterson

Hi there,

 

I posted my problem a few days ago, but I think it may not be a multiprocessing problem, because I get the same error with a single process. I have described my question in more detail here; I hope someone can help with it.

 

So I was using the Clip function in arcpy, and the following is my code:

import arcpy, os, gc
from arcpy import env

# Folder that holds the original shapefiles to be clipped
env.workspace = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/test/"
shpfilepath = "F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/test/"

def data_clip():
    for f in os.listdir(shpfilepath):
        # Only process the .shp files in the folder
        if f[(f.find('.') + 1):] == 'shp':
            in_features = f
            clip_features = "D:/Research/boundry.shp"
            out_feature_class = "F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/test/" + f[:f.find('.')] + "_china" + ".shp"
            xy_tolerance = ""
            arcpy.Clip_analysis(in_features, clip_features, out_feature_class, xy_tolerance)
            # Attempted cleanup after each clip (currently commented out):
            #del in_features, clip_features, out_feature_class, xy_tolerance
            #gc.collect()
    print('Clip finish!')

data_clip()


 

I am using the boundary shapefile to clip the original shapefiles. I have a lot of original shapefiles to clip, maybe about 10,000 (possibly more later). The code works well at first, but the problem is that it throws an out of memory error after roughly 1,000 files have been clipped. I attached the details of the error. My laptop has 16 GB of memory, but when the error occurs there are still about 10 GB free. If I separate the files into several folders and run them separately, it works well. However, in the future I may need to deal with many more files, and manual separation is not a practical approach. Do you have any idea how to solve this? Thanks a lot!
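One direction I was considering, instead of splitting the folders by hand, is to run the clips in short-lived worker processes so that whatever memory arcpy accumulates is released when each worker exits. Below is only a rough sketch of that idea (the paths are the same as above, the batch size of 500 is just a placeholder, and I have not confirmed that this actually avoids the error):

import multiprocessing
import os

import arcpy

SHP_DIR = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/test/"
OUT_DIR = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/test/"
CLIP_FEATURES = r"D:/Research/boundry.shp"  # boundary shapefile

def clip_one(shp_name):
    # Clip a single input shapefile against the boundary.
    in_fc = os.path.join(SHP_DIR, shp_name)
    out_fc = os.path.join(OUT_DIR, os.path.splitext(shp_name)[0] + "_china.shp")
    arcpy.Clip_analysis(in_fc, CLIP_FEATURES, out_fc)
    return out_fc

if __name__ == "__main__":
    shp_files = [f for f in os.listdir(SHP_DIR) if f.lower().endswith(".shp")]
    # maxtasksperchild recycles the worker after N clips, so the memory it
    # holds is released when that worker process exits and a new one starts.
    pool = multiprocessing.Pool(processes=1, maxtasksperchild=500)
    pool.map(clip_one, shp_files)
    pool.close()
    pool.join()
    print('Clip finish!')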

 

And I hope you all have a good day!

 

Best,

Hua
