I am trying to calculate the "fetch" distance between random points and the nearest shoreline in every 22.5 degree direction (e.g., distance at 0 degrees, at 22.5 degrees, at 45 degrees, etc.). Is there a tool that calculates the distance along a line from the beginning of that line to its first intersection?
I have created lines in each of these directions starting from each random point using the Bearing Distance to Line tool. Total length of each line was set to 100 miles so that it would more than cover the largest shoreline distances in the study area. The random points (500 total) were placed along the shoreline (a polyline feature) in the Breton Sound of Louisiana.
The Near, Split Line, Intersect, and Linear Ref Tools don't seem to work for what I want to do. Any ideas?
I would use Linear Referencing to do this:

1. Make each line you have created a Route (give each a unique Route Name), starting at the random point with a measure of 0 miles and ending with a measure of 100 miles. Be sure the M resolution and M tolerance are set to the level that gives you the precision you want.
2. Run the Locate Features Along Routes tool with your shoreline features as the features to be located. This will create one or more line-segment events on each route.
3. Run the Summary Statistics tool with Route Name as the case field, taking the minimum From measure of the line events on each route. That minimum measure is the distance from the random origin point to the closest shoreline crossing along the line.
4. Convert the table to a Linear Referencing point event layer, and you can confirm that all the points fall at the intersection of each line and the closest shoreline.
My preference is always to script it out with arcpy, especially in cases like this where using the GUI could quickly become cumbersome.
ArcGIS 10.3 introduces a new method for point geometries that fits the bill for this task: arcpy.PointGeometry.pointFromAngleAndDistance. Unfortunately, that likely isn't going to help you out here and now unless you are testing the prerelease version of it. In the absence of that new method, you can roll your own using basic trigonometry. Since we are rolling our own, I think it makes sense to create a function to go one step further and build the line:
```python
import math
import arcpy

def linefromPolarCoord(angle, distance, point, spatialReference=None):
    # Convert a geographic bearing (degrees clockwise from north) into
    # X/Y offsets, then build a line from the point outward.
    X = distance * math.cos(math.radians(90 - angle))
    Y = distance * math.sin(math.radians(90 - angle))
    return arcpy.Polyline(
        arcpy.Array([point, arcpy.Point(point.X + X, point.Y + Y)]),
        spatialReference
    )
```
The above function assumes basic 2D/projected data, nothing fancy with 3D distances or lines. The angle is in degrees, the distance is in the units of the projection (meters for UTM), point is an arcpy.Point, and spatialReference is an arcpy.SpatialReference for the projection being used.
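As a quick arcpy-free sanity check of the bearing convention (0 degrees = north, increasing clockwise), the same trigonometry can be isolated in a plain function (`polar_offset` is my name for it, not part of arcpy):

```python
import math

def polar_offset(angle, distance):
    # Geographic bearing (degrees clockwise from north) -> (dX, dY) offsets;
    # the same 90 - angle conversion used in linefromPolarCoord above.
    X = distance * math.cos(math.radians(90 - angle))
    Y = distance * math.sin(math.radians(90 - angle))
    return X, Y
```

A bearing of 0 with distance 100 should offset (almost exactly) 100 units north, i.e., +Y, and a bearing of 90 should offset 100 units east, i.e., +X.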
Since you want to work in decimal increments and Python's range is integer-based, I suggest rolling your own decimal-based sequence function. I like the accepted drange solution on the following StackExchange question: Python decimal range() step value.
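For concreteness, here is one possible drange in the spirit of that accepted answer (this exact implementation is mine); using Decimal avoids float drift accumulating over the steps:

```python
from decimal import Decimal

def drange(start, stop, step):
    # Yield a decimal-stepped sequence over [start, stop).
    x = Decimal(str(start))
    stop = Decimal(str(stop))
    step = Decimal(str(step))
    while x < stop:
        yield float(x)
        x += step
```

With a 22.5 degree increment, `drange(0, 360, 22.5)` yields the 16 compass directions 0, 22.5, ..., 337.5.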
Assuming you have the 500 points in a point feature class (pointsFC) and a single water body polygon in a feature class (waterbodyFC), the following code might get you in the ballpark:
```python
alpha = 22.5
distance =      # line length in projection units (e.g., 100 miles in meters)
pointsFC =      # path to your random points feature class
waterbodyFC =   # path to your water body polygon feature class
fetchesTable = 'in_memory/fetches'

arcpy.CreateTable_management('in_memory', 'fetches')
arcpy.AddField_management(fetchesTable, "POINTID", "LONG")
arcpy.AddField_management(fetchesTable, "FETCH_ANGLE", "DOUBLE")
arcpy.AddField_management(fetchesTable, "FETCH_DISTANCE", "DOUBLE")

# Grab the water body geometry and its spatial reference.
with arcpy.da.SearchCursor(waterbodyFC, "SHAPE@") as searchCur:
    for row in searchCur:
        pn = row[0]
        SR = pn.spatialReference

with arcpy.da.InsertCursor(
        fetchesTable, ["POINTID", "FETCH_ANGLE", "FETCH_DISTANCE"]) as insertCur:
    with arcpy.da.SearchCursor(pointsFC, ["OID@", "SHAPE@"]) as searchCur:
        for row in searchCur:
            OID = row[0]
            pt = row[1].firstPoint
            for i in drange(0, 360, alpha):
                # Clip the ray to the water body; its length is the fetch.
                pl = linefromPolarCoord(i, distance, pt, SR).intersect(pn, 2)
                insertCur.insertRow([OID, i, pl.length])
```
The above code assumes the points are in the water body and that you are interested in the distance to the edge of the water body, i.e., the shore. Using intersect might not be the best option if the water body wraps back on itself or has lots of bays, because that could give a length greater than the distance to the shore. That said, the code is meant to demonstrate a workflow that can be scripted.
The following shows the type of situation I caution about above:
My original code would generate a distance of 266.7 m for the given point on the lake above with a fetch angle of 337.5 degrees. My guess is that you don't want the total length of the intersected line, but the length of its first part (`getPart(0)`), which is 96.16 m.
Assuming the lines will be built from the point outwards, which I believe will always be the case with this code, the following change to the last loop will return the length of the first part and not the length of all the parts.
```python
for i in drange(0, 360, alpha):
    pl = linefromPolarCoord(i, distance, pt, SR).intersect(pn, 2)
    # Keep only the first part of the (possibly multipart) intersection.
    pl = arcpy.Polyline(pl.getPart(0), SR)
    insertCur.insertRow([OID, i, pl.length])