POST
Thank you! Yes, I tried the simpler path, but it still got the same error.
01-20-2016 09:09 AM
POST
Thank you for the help! Do I need to add this code directly before my code? I am a beginner, so it looks complex to me, but I will try it! Thanks!
01-20-2016 09:08 AM
POST
Thanks for the help, but I still get the problem. I'm wondering if it is not a multiprocessing problem. I described my question in more detail here: Out of memory error. I would really appreciate your help.
01-20-2016 09:06 AM
POST
|
Hi there, I posted my problem few days before, but I think it may not be the problem of multiprocess, because when I use uni-process, I got the same error. I described my question more detail, hope someone could help with it. So I was going to use the clip function in arcpy. And the following is my code: import arcpy, os, gc
from arcpy import env
env.workspace = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/test/"
shpfilepath="F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/test/"
def data_clip():
for f in os.listdir(shpfilepath):
if f[(f.find('.')+1):]=='shp':
in_features=f
clip_features = "D:/Research/boundry.shp"
out_feature_class = "F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/test/"+f[:f.find('.')]+"_china"+".shp"
xy_tolerance = ""
arcpy.Clip_analysis(in_features, clip_features, out_feature_class, xy_tolerance)
#del in_features, clip_features, out_feature_class, xy_tolerance
#gc.collect()
print('Clip finish!')
data_clip()
I was using the boundry shp file to clip the original shp file. I have lots of original shp file to clip, like maybe about 10,000 shp files (maybe more later). This code worked well at first, but the problem is it would show an out of memory error after maybe about 1,000 files were cliped. And I attached the details of the error. My laptop has 16G memory, but when I got the error, there still has about 10G memory I could use. I could separate these files into several file folders, and it could work well. However, in the future, I may need to deal with much more files, and the manual separation is not the way. Do you have any idea about how to solve it? Thanks a lot! And hope you all have a good day! Best, Hua
... View more
01-20-2016
09:04 AM
|
0
|
2
|
2191
|
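[Editor's note] Since the error appears after a roughly fixed number of clips even in a single process, one common mitigation is to split the file list into chunks and run each chunk in a fresh Python process, so any memory leaked by the geoprocessing calls is returned to the OS when the child exits. A minimal sketch of the pattern; the `chunks` helper and the `clip_chunk.py` worker script are hypothetical names, not from the original post:

```python
import subprocess
import sys

def chunks(items, size):
    """Yield successive fixed-size slices of a list."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

def run_in_batches(shp_files, batch_size=500):
    """Run a hypothetical worker script once per batch.

    Each child process starts with a clean address space, so memory
    leaked while clipping one batch cannot accumulate across batches.
    """
    for batch in chunks(shp_files, batch_size):
        # clip_chunk.py (hypothetical) would perform the arcpy.Clip_analysis
        # calls for the shapefiles passed on its command line.
        subprocess.check_call([sys.executable, "clip_chunk.py", *batch])
```

The batch size is a tuning knob: it should be comfortably below the ~1,000-file point where the error was observed.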
POST
Thanks! I'll check that. But even when I didn't use multiprocessing, I got the same error.
01-10-2016 02:18 PM
POST
I want to use the Clip function in Python; the code is as follows:

import os
from multiprocessing import Pool

import arcpy
from arcpy import env

def data_clip(shp):
    env.workspace = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/f"
    clip_features = "D:/Research/boundry.shp"
    pre, suf = os.path.splitext(shp)
    new = pre + "_china" + suf
    in_mem = os.path.join("in_memory", pre)
    out_feature_class = os.path.join(
        "F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ChinaShpFile", new)
    arcpy.Clip_analysis(shp, clip_features, in_mem)
    arcpy.CopyFeatures_management(in_mem, out_feature_class)
    arcpy.DeleteFeatures_management(in_mem)

if __name__ == "__main__":
    env.overwriteOutput = True
    env.workspace = r"F:/GCM_CMIP5/FGOALS-g2(LASG-CESS)/mrro/ShpFile/f"
    p = Pool(7)
    fcs = arcpy.ListFeatureClasses()
    p.map(data_clip, fcs)
    print('Clip finish!')

But I get an "out of memory" error. Does anyone know how to solve this problem? Thank you so much!
01-10-2016 02:04 PM
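[Editor's note] For the Pool-based version above, one possible workaround, assuming memory is being leaked inside the worker processes, is `multiprocessing.Pool`'s `maxtasksperchild` parameter: each worker process is replaced with a fresh one after completing that many tasks, so leaked memory is released periodically instead of accumulating. A minimal sketch with a stand-in worker function (`fake_clip` is illustrative only, not arcpy):

```python
from multiprocessing import Pool

def fake_clip(name):
    # Stand-in for the real clip worker; returns the output name it would write.
    return name + "_china"

if __name__ == "__main__":
    # maxtasksperchild=100: recycle each worker after 100 tasks, bounding
    # how much leaked memory any single worker process can accumulate.
    with Pool(processes=4, maxtasksperchild=100) as p:
        results = p.map(fake_clip, ["a", "b", "c"])
    print(results)  # ['a_china', 'b_china', 'c_china']
```

In the real script, `data_clip` would replace `fake_clip`, and a chunk count around 100 tasks per worker keeps process-restart overhead small relative to the cost of each clip.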
Online Status: Offline
Date Last Visited: 11-11-2020 02:24 AM