
Do I need to clean up memory or something each time I call merge_management?

Question asked by jesuiselysee on Feb 28, 2018
Latest reply on Mar 1, 2018 by rborchert

I have a load of shapefiles that I need to merge by state, so basically I have a loop that calls merge_management once per state. The script runs fine for the first 20 states or so, and then it stops on its own without generating any error message.

If I run just the state where it stopped by itself, it seems to merge fine.

Is it some kind of memory problem or something? Should I do some cleanup each time I call merge_management?
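
For example, is something like this needed after every merge call? This is just a guess on my part that I have not tried yet; shapefiles_to_merge and merged_output are placeholders for the inputs used in my code below.

import gc
import arcpy

result = arcpy.Merge_management(shapefiles_to_merge, merged_output)
del result                                # drop the geoprocessing result object
arcpy.ClearWorkspaceCache_management()    # release any cached workspace connections
gc.collect()                              # force Python garbage collection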

The code is very simple and straightforward:

import arcpy
import os
import time
# City, cityeqv_constants, delete_shpfile and US_CENSUS_AREAWATER_SUFFIX are defined elsewhere in the script

def batch_projwater2wgs84(strparentfolder, outputfgd):
    # merge areawater per state first
    lst_statefips = City.read_states(cityeqv_constants.STATEINFO_FILE, 'FIPS')
    #lst_statefips = ['21']
    for statefips in lst_statefips:
        time_state = time.time()
        # areawater has previously been moved to the state folder; merge everything under
        # this folder into one file and output this file to the parent folder
        merge_areawater_by_state(os.path.join(strparentfolder, cityeqv_constants.US_CENSUS_FILEPREF + statefips + US_CENSUS_AREAWATER_SUFFIX),
                                 strparentfolder, statefips)
        print statefips + ' finished in ' + str(time.time() - time_state) + ' seconds'


"""
merge all areawater into one shape for processing
"""
def merge_areawater_by_state(shapefile_folder, outputfolder, statefips):
    arcpy.env.workspace = shapefile_folder
    # collect every shapefile in the state folder
    lst_shp = []
    for shapefile in os.listdir(shapefile_folder):
        if shapefile.endswith('.shp'):
            lst_shp.append(shapefile)

    output_merge_filename = cityeqv_constants.US_CENSUS_FILEPREF + 'merge_' + statefips + US_CENSUS_AREAWATER_SUFFIX
    output_mergefile = os.path.join(outputfolder, output_merge_filename + '.shp')
    # delete the output file if it already exists
    delete_shpfile(outputfolder, output_merge_filename)
    arcpy.Merge_management(lst_shp, output_mergefile)
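
Since the script dies without printing anything, I am also thinking of wrapping the merge call in a try/except on my next run, just so any geoprocessing messages get printed before it stops. Again, this is only a sketch and is not part of the code above.

import sys
import traceback

try:
    arcpy.Merge_management(lst_shp, output_mergefile)
    print arcpy.GetMessages()        # geoprocessing messages for this state
except arcpy.ExecuteError:
    print arcpy.GetMessages(2)       # geoprocessing error messages only
except Exception:
    traceback.print_exc()            # any non-arcpy Python error
sys.stdout.flush()                   # make sure output is written before the script exits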
