
Task...

Identify the 2 highest elevation points in a series of polygons.

 

Purpose...

To win a bet amongst neighbors... to locate something like a tower... find observation points for visibility analysis... lots of reasons

 

Conceptualization...

  • Intersect the points with the polygons, retaining all attributes... output type... point
  • Add an X and Y field to the result so you have the point coordinates for later use, if you don't have them already (these two steps are sketched just after this list)
  • Delete any 'fluff' fields that you don't need, but keep the ID, KEY, OBJECTID fields from each. They will have a unique name for identification.
  • Split the result into groupings using the key fields associated with the polygons (see code)
  • Sort the groupings on a numeric field, like elevation (ascending/descending)     (code)
  • Slice the number that you need from each grouping.   (code)
  • Send it back out to a featureclass if you need it.    (code)
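
A minimal sketch of those first two steps, with hypothetical paths... the tool calls are standard arcpy, but substitute your own point and polygon featureclasses.

import arcpy

# ---- hypothetical inputs and output... substitute your own
pnts  = r"C:\path_to_your\Geodatabase.gdb\points"
polys = r"C:\path_to_your\Geodatabase.gdb\polygons"
in_fc = r"C:\path_to_your\Geodatabase.gdb\intersect_featureclass_name"

# intersect, keeping all attributes and forcing point output
arcpy.Intersect_analysis([pnts, polys], in_fc, "ALL", "", "POINT")

# add the coordinates... AddXY creates POINT_X/POINT_Y fields;
# rename or copy them into your own X/Y fields (Xs/Ys later on) if you prefer
arcpy.AddXY_management(in_fc)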

 

For Picture People...

A sample table resulting from the intersection of points with polygons.

 

The points (in red) and the polygons (a grid pattern) with the two points with the highest 'Norm' value in each polygon

 

A close-up look at the result

 

For the Pythonistas....

Fire up Spyder or your favorite Python IDE

 

Table to NumPy Array....

>>> a = arcpy.da.TableToNumPyArray("path to the featureclass or table", field_names="*")  # you now have an array

 

With me so far? You now have an array.
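
If you want to confirm what came across, the array's dtype carries the field names... handy before picking the split and value fields below (the output here is just illustrative, your field list will differ).

>>> a.dtype.names
('OBJECTID', 'Grid_codes', 'Norm', 'Xs', 'Ys')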

 

Make a script... call it something (splitter.py for example)

 

The work code... 

import numpy as np
import arcpy


def split_sort_slice(a, split_fld=None, val_fld=None, ascend=True, num=1):
    """Split an array on a key field, sort each group on a value field,
    then slice the first `num` records from every group.
    Dan_Patterson@carleton.ca 2019-01-28
    """

    def _split_(a, fld):
        """Split the unsorted array into subarrays, one per unique key."""
        out = []
        uni = np.unique(a[fld])
        for key_val in uni:
            key = (a[fld] == key_val)
            out.append(a[key])
        return out
    #
    err_0 = "The split field {} isn't present in the array"
    if split_fld not in a.dtype.names:
        print(err_0.format(split_fld))
        return a                                  # return the array untouched
    subs = _split_(a, split_fld)
    ordered = []
    for sub in subs:
        r = sub[np.argsort(sub, order=val_fld)]   # sort the group on the value field
        if not ascend:
            r = r[::-1]                           # flip for a descending sort
        n = min(num, r.size)                      # don't ask for more records than the group has
        ordered.extend(r[:n])
    out = np.asarray(ordered)
    return out

# ---- Do this ----

in_fc = r"C:\path_to_your\Geodatabase.gdb\intersect_featureclass_name"

a = arcpy.da.TableToNumPyArray(in_fc, "*")

out = split_sort_slice(a, split_fld='Grid_codes', val_fld='Norm', ascend=False, num=2)  # the 2 highest 'Norm' values per polygon

out_fc = r"C:\path_to_your\Geodatabase.gdb\output_featureclass_name"

arcpy.da.NumPyArrayToFeatureClass(out, out_fc, ['Xs', 'Ys'], arcpy.SpatialReference(2951))  # Xs/Ys hold the point coordinates

 

The last lines above, which send the array back out to a featureclass, use:

NumPyArrayToFeatureClass—Data Access module | ArcGIS Desktop 

 

Open ArcGIS Pro, refresh your database and add the result to your map.

 

I will put this or a variant into...

 

Point Tools …


So that you can click away, for those who don't like to type.

Lines

 

Different incarnations and names

Pretty easy to form the origin-destination pairs.

Start at a point.

Throw in horizontal and/or vertical offsets.

A dash of an azimuth/bearing.

A tad of NumPy

A bit of Arcpy and....

A good way to spend some time, so you write it down because you will forget and reinvent it later.
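
The idea in a few lines... a minimal sketch only, not the attached script; od_pairs and the origin coordinates are made up for illustration, but the sin/cos treatment of the azimuth matches the snippet further down.

import numpy as np


def od_pairs(orig, dist, bearing, x_offset=0.0, y_offset=0.0):
    """Hypothetical helper: from/to coordinates for one or more azimuths.

    Bearings are degrees clockwise from north, hence sin() for the x step
    and cos() for the y step.
    """
    b = np.atleast_1d(bearing).astype(float)
    rads = np.deg2rad(b)
    x_from = np.full(b.size, orig[0] + x_offset)   # the origin, nudged by the offsets
    y_from = np.full(b.size, orig[1] + y_offset)
    x_to = x_from + np.sin(rads) * dist            # destination = origin + step along the bearing
    y_to = y_from + np.cos(rads) * dist
    return np.column_stack((x_from, y_from, x_to, y_to))


# one 100 m line on a bearing of 45 degrees from a made-up origin
print(od_pairs([300000, 5025000], 100, 45))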

Almost forgot...

 

There is always one student that thinks outside the box. 

Hmmm, could be a bonus here... I wonder if any of mine can replicate the compass with 10-degree increments?

In the attached code, I made these changes

    rads = np.deg2rad(bearing)
    dx = np.sin(rads) * dist    # azimuths: sin for the x step, cos for the y step
    dy = np.cos(rads) * dist
    #
    n = len(bearing)            # bearing needs to be a list or array here
    N = [N, n][n > 1]           # use the number of bearings when more than one is given

 

And used this

b = np.arange(0, 361, 22.5)
a, data = transect_lines(N=1, orig=[some x, some y],
                         dist=100, x_offset=0, y_offset=0,
                         bearing=b, as_ndarray=True)

You can't have it both ways, in a manner of speaking. By limiting N to the number of bearings, you let numpy generate the desired angles. There is no x or y offset since the origin is now fixed.
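
And the 10-degree compass bonus from above works the same way... the origin here is just a placeholder, swap in your own coordinates.

b = np.arange(0, 360, 10)                  # 36 bearings, every 10 degrees
a, data = transect_lines(N=1, orig=[300000, 5025000],
                         dist=100, x_offset=0, y_offset=0,
                         bearing=b, as_ndarray=True)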

 

How to use the attached...

""---- use these as your inputs, with edits of course

# ---- make the x, y coordinate table
SR = 2951  # a projected coordinate system preferably

a, data = transect_lines(N=10, orig=[299000, 5000000], dist=100,
                         x_offset=10, y_offset=0, bearing=-10, as_ndarray=True)
p0 = r"C:\Your_path\Your.gdb\a_tbl"

arcpy.da.NumPyArrayToTable(a, p0)

# ---- now for the lines
p1 = r"C:\Your_path\Your.gdb\some_lines"

arcpy.XYToLine_management(p0, p1,
                          'X_from', 'Y_from',
                          'X_to', 'Y_to',
                          spatial_reference=SR)

"""

 

PS

The python/numpy part is quite speedy, using variants of

%timeit transect_lines(N=10, orig=[0,0], dist=1, x_offset=0, y_offset=0, bearing=0, as_ndarray=True)

That is microseconds for the speed geeks.  I couldn't see a use case to test for larger arrays.

    N      Time
   10      36.0 µs ± 309 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
   50      39.3 µs ± 3.4 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
  100      42.9 µs ± 6.57 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
  500      46.5 µs ± 502 ns per loop (mean ± std. dev. of 7 runs, 10000 loops each)
 1000      54.9 µs ± 1.39 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)
I didn't bother to test the featureclass creation since I have no control over that.