I'm trying to split my Watersheds polygon feature class into groups of non-overlapping polygons.
I'm currently using the ESRI Spatial Analyst Supplemental Tools
New Spatial Analyst Supplemental tools, v1.3 | ArcGIS Blog
The Zonal Statistics Tool 2 groups overlapping polygons into sets of non-overlapping polygons before running Zonal Statistics, but I'm unable to follow the logic that's being used, as I'm still new to Python.
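As far as I can tell, the grouping idea works roughly like this (a plain-Python sketch of the concept with made-up data, no arcpy, and not the tool's actual code): each pass greedily picks a set of features that don't overlap each other and gives them the next group number, repeating until every feature is grouped.

```python
def explode_overlaps(overlaps):
    """Assign group numbers so no two overlapping features share a group.

    overlaps: dict mapping feature ID -> set of IDs it overlaps.
    Returns a dict mapping feature ID -> group number (1, 2, ...).
    """
    groups = {}
    ungrouped = set(overlaps)
    group = 1
    while ungrouped:
        taken = set()  # IDs excluded from this pass (picked, or overlap a pick)
        # Most-overlapped features first, mirroring the 'ovlpCount D' sort
        for fid in sorted(ungrouped, key=lambda f: (-len(overlaps[f]), f)):
            if fid not in taken:
                groups[fid] = group
                taken.add(fid)
                taken |= overlaps[fid]  # neighbours must wait for a later pass
        ungrouped -= set(groups)
        group += 1
    return groups

# Toy data: 2 overlaps 1 and 3; 4 overlaps nothing
groups = explode_overlaps({1: {2}, 2: {1, 3}, 3: {2}, 4: set()})
```

With this input, features 1 and 3 can share a group, but 2 must land in a different group from both of them.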
I've also found the following link on the Stack Overflow website and downloaded the sample code.
I've made amendments to the source code from the Stack Overflow website:
```python
import os
import arcpy
from arcpy import GetParameterAsText

fc = GetParameterAsText(0)
idName = GetParameterAsText(1)

# Work out the workspace that contains the input feature class
dirname = os.path.dirname(arcpy.Describe(fc).catalogPath)
desc = arcpy.Describe(dirname)
if hasattr(desc, "datasetType") and desc.datasetType == 'FeatureClass':
    dirname = os.path.dirname(dirname)
arcpy.env.workspace = dirname


def countOverlaps(fc, idName):
    # Self-intersect the input and find identical shapes to work out
    # which input features overlap each other
    intersect = arcpy.Intersect_analysis(fc, 'intersect')
    findID = arcpy.FindIdentical_management(intersect, "explFindID", "Shape")
    arcpy.MakeFeatureLayer_management(intersect, "intlyr")
    arcpy.AddJoin_management("intlyr", arcpy.Describe("intlyr").OIDfieldName,
                             findID, "IN_FID", "KEEP_ALL")
    segIDs = {}
    featseqName = "explFindID.FEAT_SEQ"
    idNewName = "intersect." + idName
    for row in arcpy.SearchCursor("intlyr"):
        featseqVal = row.getValue(featseqName)
        segIDs[featseqVal] = []
    for row in arcpy.SearchCursor("intlyr"):
        idVal = row.getValue(idNewName)
        featseqVal = row.getValue(featseqName)
        segIDs[featseqVal].append(idVal)
    # segIDs2 maps each feature ID to the IDs of the features it overlaps
    segIDs2 = {}
    for row in arcpy.SearchCursor("intlyr"):
        idVal = row.getValue(idNewName)
        segIDs2[idVal] = []
    for x, y in segIDs.iteritems():
        for segID in y:
            segIDs2[segID].extend([k for k in y if k != segID])
    for x, y in segIDs2.iteritems():
        segIDs2[x] = list(set(y))  # was "segIDs2 = list(set(y))", which clobbered the dict
    arcpy.RemoveJoin_management("intlyr", arcpy.Describe(findID).name)
    # Write the overlap list and overlap count back to the input
    if 'overlaps' not in [k.name for k in arcpy.ListFields(fc)]:
        arcpy.AddField_management(fc, 'overlaps', "TEXT")
    if 'ovlpCount' not in [k.name for k in arcpy.ListFields(fc)]:
        arcpy.AddField_management(fc, 'ovlpCount', "SHORT")
    urows = arcpy.UpdateCursor(fc)
    for urow in urows:
        idVal = urow.getValue(idName)
        if segIDs2.get(idVal):
            urow.overlaps = str(segIDs2[idVal]).strip('[]')
            urow.ovlpCount = len(segIDs2[idVal])
            urows.updateRow(urow)

# countOverlaps(fc, idName)  # redundant: explodeOverlaps() calls it below


def explodeOverlaps(fc, idName):
    countOverlaps(fc, idName)
    # "expl" holds the group number; features sharing a group do not overlap
    arcpy.AddField_management(fc, 'expl', "SHORT")
    urows = arcpy.UpdateCursor(fc, '"overlaps" IS NULL')
    for urow in urows:
        urow.expl = 1
        urows.updateRow(urow)
    i = 1
    lyr = arcpy.MakeFeatureLayer_management(fc)
    while int(arcpy.GetCount_management(arcpy.SelectLayerByAttribute_management(
            lyr, "NEW_SELECTION", '"expl" IS NULL')).getOutput(0)) > 0:
        ovList = []
        # Process the most-overlapped features first
        urows = arcpy.UpdateCursor(fc, '"expl" IS NULL', '', '', 'ovlpCount D')
        for urow in urows:
            ovVal = urow.overlaps
            idVal = urow.getValue(idName)
            intList = [int(x) for x in ovVal.replace(' ', '').split(',')]
            if idVal not in ovList:
                urow.expl = i
                urows.updateRow(urow)
                ovList.extend(intList)
        i += 1


explodeOverlaps(fc, idName)
```
but I'm receiving the following error message and I'm unable to resolve it. I'd really appreciate assistance in resolving this, as I'm still new to Python.
Regards
Peter Wilson
To anyone that might be interested in using the above: Tony Desilva answered my question on the Stack Overflow website. The problem was that my IDs were too long; they started at 10000 and were being inserted into the Overlaps field to record which Watersheds were overlapped. I reduced my IDs to the thousands and the script ran successfully.
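For anyone hitting the same wall, my reading of why the 5-digit IDs broke things (an inference on my part, beyond what was in the answer): the script creates "overlaps" as a plain TEXT field, which gets the default 255-character length, and the comma-separated overlap list simply overflows it sooner with longer IDs. A quick plain-Python illustration:

```python
def overlaps_string(ids):
    # Same formatting the script writes into the field: str(list).strip('[]')
    return str(list(ids)).strip('[]')

FIELD_LEN = 255  # default TEXT field length when no length is specified

# 40 five-digit IDs produce a 278-character string, over the 255 limit...
long_ids = overlaps_string(range(10000, 10040))

# ...while the same count of 4-digit IDs (238 characters) still fits.
short_ids = overlaps_string(range(1000, 1040))
```

So an alternative to renumbering would be creating the field with an explicit, larger length up front.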
Regards
Peter Wilson