I can't seem to get buffer_analysis to buffer by the distance I want. I'm writing a Python tool that needs to buffer points to polygons and dissolve them as part of the workflow. From the ArcMap 10.8 UI the buffer_analysis tool works as I expect on the same input data and parameters I'm testing my tool against. The documentation seems to indicate that I can specify '10 meters' for 'buffer_distance_or_field' when calling it from my tool, but the actual buffer distance I get is much, much larger. The input feature class is geographic (WGS84), but this does not seem to be a problem when I use the tool from the UI; I get the expected buffer distance in the output. I am passing the 'method' parameter as 'PLANAR' in both cases.
What could I be doing wrong?
TIA,
Alan
I believe that when you run it from Desktop, the PLANAR method takes its unit from the CRS of the map/data frame if the data isn't already in a planimetric CRS. When run outside of Desktop, it equates decimal degrees to the planar unit of measure, which makes your buffer huge. This is not gospel, however; it's only my interpretation from memory/experience.
Your best bet will be to set method='GEODESIC', as I see no reason not to, and I believe that would solve your issue.
Otherwise you can of course project your features to a suitable planimetric CRS and then run the tool with your previous arguments.
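For the second option, something like this should work (a minimal sketch; the feature class names are placeholders, and EPSG 32633 is just an example, so substitute the projected CRS appropriate for your data):

import arcpy

# Project the geographic points into a planimetric CRS first.
# EPSG 32633 (WGS 84 / UTM zone 33N) is only an example; pick your own zone.
arcpy.Project_management('photo_points', 'photo_points_utm', arcpy.SpatialReference(32633))

# With projected data, a linear buffer distance behaves as expected.
arcpy.Buffer_analysis('photo_points_utm', 'photo_buffers', '10 Meters', dissolve_option='ALL', method='PLANAR')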
Thanks, DavidPike.
I'm doing this now:
# Rough meters-to-degrees conversion using the Earth's equatorial circumference
degrees_per_meter = 360.0 / 40075000.0
distance = '{0} DecimalDegrees'.format(float(bin_radius) * degrees_per_meter)
arcpy.Buffer_analysis(fc_photo, fc_buffer, distance, dissolve_option='ALL', method='GEODESIC')
However, the buffer distance is still much larger than what I would expect based on the points I'm feeding it. The script is an adaptation of ESRI's GeoTaggedPhotosToPoints_management, so the input points come from geotagged JPEG files; I'm adding the ability to merge clusters of points into single points with multiple photos attached. I'm running my tool in an ArcMap 10.8 session with a new, completely empty map. Everything works except this problem with the buffer distance.
Forget about the degrees_per_meter conversion; let the software do what it's designed to do.
You just need to change your buffer_distance_or_field argument to use 'Meters', e.g. '10 Meters':
distance = '{0} Meters'.format(float(bin_radius))
arcpy.Buffer_analysis(fc_photo, fc_buffer, distance, dissolve_option='ALL', method='GEODESIC')
Thanks, DavidPike.
It turned out the real problem was my misunderstanding of the expected output from Buffer_analysis(), not an incorrect buffer distance. I wrongly concluded the buffer distance was at fault because the output feature count was one. When I earlier ran Buffer_analysis() manually, I only looked at the output graphically and not closely enough: I thought I was looking at multiple polygons, but in fact it was a single polygon with multiple outer rings. Adding a call to MultipartToSinglepart_management() to convert the outer rings into separate polygons solved my problem.
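In case it helps anyone later, the fixed workflow looks roughly like this (a minimal sketch; the feature class names are placeholders):

import arcpy

# dissolve_option='ALL' merges every buffer into one feature, which can be
# a single multipart polygon with several outer rings
arcpy.Buffer_analysis('photo_points', 'photo_buffers', '10 Meters', dissolve_option='ALL', method='GEODESIC')

# Split the multipart result so each outer ring becomes its own polygon feature
arcpy.MultipartToSinglepart_management('photo_buffers', 'photo_buffers_singlepart')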