
# Reclassify raster data simply

Posted by Dan_Patterson Aug 14, 2016

Reclassifying raster data can be a bit of a challenge, particularly if there are nodata values in the raster.  This is a simple example of how to perform classifications using a sample array.

An array will be used, since it is simple to bring raster data into numpy using arcpy's:

•   RasterToNumPyArray
•   RasterToNumPyArray (in_raster, {lower_left_corner}, {ncols}, {nrows}, {nodata_to_value})

and sending the result back out using:

• NumPyArrayToRaster
• NumPyArrayToRaster (in_array, {lower_left_corner}, {x_cell_size}, {y_cell_size}, {value_to_nodata})

On with the demo...

Raster with full data

old                                new

[ 0  1  2  3  4  5  6  7  8  9]    [1 1 1 1 1 2 2 2 2 2]

[10 11 12 13 14 15 16 17 18 19]    [3 3 3 3 3 4 4 4 4 4]

[20 21 22 23 24 25 26 27 28 29]    [5 5 5 5 5 6 6 6 6 6]

[30 31 32 33 34 35 36 37 38 39]    [7 7 7 7 7 7 7 7 7 7]

[40 41 42 43 44 45 46 47 48 49]    [7 7 7 7 7 7 7 7 7 7]

[50 51 52 53 54 55 56 57 58 59]    [7 7 7 7 7 7 7 7 7 7]

```
# basic structure
a = np.arange(60).reshape((6, 10))                # (2) input array
a_rc = np.zeros_like(a)                           # (3) result array
bins = [0, 5, 10, 15, 20, 25, 30, 60, 100]        # (4) old class breaks
new_bins = [1, 2, 3, 4, 5, 6, 7, 8]               # (5) new class values
new_classes = zip(bins[:-1], bins[1:], new_bins)  # (6)
for rc in new_classes:                            # (7)
    q1 = (a >= rc[0])                             # (8)
    q2 = (a < rc[1])                              # (9)
    z = np.where(q1 & q2, rc[2], 0)               # (10)
    a_rc = a_rc + z                               # (11)
# a_rc now holds the reclassified result
```

Lines 2 to 5 define the array/raster, the class breaks that will be used in reclassifying the raster, and the new class values to assign to each class.  Line 6 simply zips the bins and new_bins into a new_classes arrangement, which is subsequently used to query the array, locate the appropriate values and perform the assignment (lines 7 to 11).

Line 3 is simply the array in which the results will be placed.  The np.zeros_like function essentially creates an array with the same shape and data type as the input array.  There are other options that could be used to create a container or result array, but reclassification is going to be a simple addition process...

• locate the old classes
• reclass those cells to a new value
• add the results to the containment raster/array

Simple but effective... just ensure that your new classes are inclusive by adding one class value outside the possible range of the data.

Line 10 contains the np.where statement which cleverly allows you to put in a query and assign an output value where the condition is met and where it is not met.  You could be foolish and try to build the big wonking query that handles everything in one line... but you would soon forget when you revisit the resultant the next day.  So to alleviate this possibility, the little tiny for loop does the reclassification one grouping at a time and adds the resultant to the destination array.  When the process is complete, the final array is returned.
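As an aside, the whole loop can also be collapsed into a single vectorized call.  This is not the approach used above, just an equivalent sketch using np.digitize with the same bins and new_bins:

```python
import numpy as np

a = np.arange(60).reshape((6, 10))
bins = [0, 5, 10, 15, 20, 25, 30, 60, 100]
new_bins = np.array([1, 2, 3, 4, 5, 6, 7, 8])

# digitize returns, for each cell, the index of the class interval it falls in
idx = np.digitize(a, bins[1:])   # upper edges of each class
a_rc = new_bins[idx]

print(a_rc[0])   # -> [1 1 1 1 1 2 2 2 2 2]
```

The fancy index new_bins[idx] does the assignment in one step; the loop version above is easier to revisit the next day, which is the point being made.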

Now on to the arrays/rasters that have nodata values.  The assignment of nodata values is handled by RasterToNumPyArray so you should be aware of what is assigned to it.

Raster with nodata values

old                                  new

[--  1  2  3  4  5  6 --  8  9]      [--  1  1  1  1  2  2 --  2  2]

[10 11 12 13 -- 15 16 17 18 19]      [ 3  3  3  3 --  4  4  4  4  4]

[20 -- 22 23 24 25 26 27 -- 29]      [ 5 --  5  5  5  6  6  6 --  6]

[30 31 32 33 34 -- 36 37 38 39]      [ 7  7  7  7  7 --  7  7  7  7]

[40 41 -- 43 44 45 46 47 48 --]      [ 7  7 --  7  7  7  7  7  7 --]

[50 51 52 53 54 55 -- 57 58 59]      [ 7  7  7  7  7  7 --  7  7  7]

Make a mask (aka ... the nodata values) where the numbers are evenly divisible by 7.

Perform the reclassification using the previous conditions.

```
# mask the values
a_mask = np.ma.masked_where(a % 7 == 0, a)
a_mask.set_fill_value(-1)  # set the nodata value
```
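One workable pattern (a sketch only; the attached script handles more cases) is to reclassify the raw array first, then reapply the mask and fill value to the result:

```python
import numpy as np

a = np.arange(60).reshape((6, 10))
nodata = (a % 7 == 0)                # the cells to treat as nodata

bins = [0, 5, 10, 15, 20, 25, 30, 60, 100]
new_bins = [1, 2, 3, 4, 5, 6, 7, 8]
a_rc = np.zeros_like(a)
for lo, hi, new in zip(bins[:-1], bins[1:], new_bins):
    a_rc = a_rc + np.where((a >= lo) & (a < hi), new, 0)

# reapply the mask to the reclassed result and set the nodata value
a_rc_m = np.ma.masked_where(nodata, a_rc)
a_rc_m.set_fill_value(-1)
print(a_rc_m[0, :3])   # [-- 1 1]
```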

The attached sample script prints out the test with the following information:

Input array ... type ndarray

...snip...

Reclassification using

:  from [0, 5, 10, 15, 20, 25, 30, 60, 100]

:  to   [1, 2, 3, 4, 5, 6, 7, 8]

:  mask is False value is None

Reclassed array

...snip...

...snip

Reclassification using

:  from [0, 5, 10, 15, 20, 25, 30, 60, 100]

:  to   [1, 2, 3, 4, 5, 6, 7, 8]

:  mask is True value is -1

Reclassed array

...snip....

-------------------------------------

That's about all for now.  Check the documentation on masked arrays and their functions.  Most functions and properties that apply to ndarrays also apply to masked arrays... it's like learning a new language just by altering the pronunciation of what you already know.

# Anaconda, Spyder and ArcGIS PRO

Posted by Dan_Patterson Jul 17, 2016

Anaconda ... python ... spyder ... conda .... arcgis pro

Ok... change is inevitable... at least we still have IDEs, having harkened from the card reader days.

So this is a running visual of installing Spyder so I can use it with ArcGIS PRO and for general python 3.4/5/6 work.

I have used pythonwin and pyscripter for some time.

Some people are charmed by PyCharm and some can't wait to be IDLE... but for now, I will document Spyder.

I will add to this as I find more..

NOTE See the Comments section for other sample images of Spyder

2017-10-31 Anaconda 5.0 is out

see the changelog for details

2017-02-18 Anaconda 4.3 is out

see the full changelog for details...  some highlights...

2016-10-20 Conda and ArcGIS Pro | ArcPy Café

Things just got a whole load easier....

Current distribution supports up to and including python 3.6, and a nice new Spyder plus many more upgraded packages.  I am using this for future-proofing my work.  Arc* will eventually be there so you may as well test while you can.

-------------------------------------------------------------------------------------------------------------------------------------------------

I will leave the material below as a legacy record since much of it still applies

The original link on ArcGIS Pro and the changes to managing python environments can be found here

Related python information can also be found in


-  2016-07-15  importing xlsxwriter

-  2016-07-15  initial post

### :------------------------------------------------------------------------------------------------------------------------:

State the obvious..... Install ArcGIS PRO

Just follow the instructions.  Don't try and monkey-patch an old machine that barely runs ArcMap.  Start fresh.

I have never used Arc*'s built in IDE.  I am not sure why they include it, except for the fact that they can control its installation and no one needs to worry about problems associated with a separate  IDE.  I installed spyder, Jupyter Qt-console and PythonWin.  Spyder and Jupyter are included in the Anaconda distribution that esri provides.  If you don't use one now, then install spyder.

If you want to use another one, go to the Project pane, the Geoprocessing Options, and do some setup.  Spyder is located in a somewhat cryptic folder path, which I have shown in navigation mode and in step 2 as a visual with cutouts for the fiddly bits.  In reality, the portion of the path C:\Program Files\ArcGIS\Pro is the default installation path and the one most likely to be offered during installation and for multi-user machines.  I personally install mine in a folder where I know where everything is... C:\ArcPro.

:------------------------------------------------------------------------------------------------------------------------:

3. The file path to locate the executable

I assume a default path installation in the following.  Everything prior to the ...\bin folder is specific to your machine install path.

:------------------------------------------------------------------------------------------------------------------------:

4. The site-package folder

What is included by default with an ArcGIS Pro installation... this is not a complete list of available packages; the list of those is given above in the table.  The packages come by python version.  We are currently using python 3.4.x in ArcGIS PRO and ArcMap.

:------------------------------------------------------------------------------------------------------------------------:

5. The Spyder folder contents

What is in the spyder folder... scientific_startup does some standard imports for what-it-says-type work

:------------------------------------------------------------------------------------------------------------------------:

6. The pkgs folder

A record of the packages that were installed.

:------------------------------------------------------------------------------------------------------------------------:

7.  Importing packages... xlsxwriter demo

Here is the workflow I used to import the xlsxwriter module for use in python and arcmap (presumably).


From the start button (windows 10, bottom left), navigate to the ArcGIS folder via All Apps, find the Python Command Prompt, right-click on it and Run as Administrator.

Do the conda install xlsxwriter entry as suggested in the originating post.

Hit Enter and away you go. The magic happens and it should be installed.

At this stage, I went back to Spyder and from the IPython console I tested... looks good

:------------------------------------------------------------------------------------------------------------------------:

More as I find it...

# Sliding/Moving window operations in rasters and arrays

Posted by Dan_Patterson Jul 7, 2016

### Sliding/Moving windows ...

This is the companion to block functions introduced earlier.  I will keep it simple.  The stats functions for rasters with and without nodata values still apply to this type of treatment.  The only difference is how the sub-arrays are generated.  I won't get into the details of how they are done here, but the stride function uses some pretty tricky handling of the input array structure to parse out the moving windows.

Remember, block functions move one window size across and down, per movement cycle.  Sliding functions move one cell across and down and sample the cells based on the window size... hence, there are many more windows to sample in sliding functions than in block functions.  I will just show the code snippets and you can play on your own.  I have posted sample timing results for comparative purposes for both implementations.

• The stride_tricks does the work.  You can examine the code in the ...site-packages, numpy lib folder.
• functools provides decorator capabilities.  I tend to import it by default since I use decorators a lot for timing and documentation functions.
• The textwrap module provides several functions, with dedent being useful for output documentation.
• The numpy set_printoptions function allows you to control the appearance of output.  The version here is quite simplified...refer to the documentation for more information.

Stride and block functions provide the means to sample the input array according to the type of data, the size and the layout.  You must read the documentation and experiment in order to fully understand.  It really messes with your head and makes the Rubik's cube look simple.  The decorator and timer functions I have talked about before.  Refer to my previous blogs for more information.

imports

```
import numpy as np
from numpy.lib.stride_tricks import as_strided
from functools import wraps
from textwrap import dedent

np.set_printoptions(edgeitems=3, linewidth=80, precision=2,
                    suppress=True, threshold=100)
```

slide (stride) and block functions

```
# ---- functions ----
def _check(a, r_c):
    """Performs the array checks necessary for stride and block.
    : a   - Array or list.
    : r_c - tuple/list/array of rows x cols.
    :     Attempts will be made to produce a shape of at least (1*c).
    :     For a scalar, the minimum shape will be (1*r) for a 1D array
    :     or (1*c) for a 2D array if r<c.  Be aware.
    """
    if isinstance(r_c, (int, float)):
        r_c = (1, int(r_c))
    r, c = r_c
    a = np.atleast_2d(a)
    shp = a.shape
    r, c = r_c = (min(r, shp[0]), min(c, shp[1]))
    a = np.ascontiguousarray(a)
    return a, shp, r, c, tuple(r_c)


def stride(a, r_c=(3, 3)):
    """Provide a 2D sliding/moving view of an array.
    :  There is no edge correction for outputs.
    :
    :Requires
    :--------
    : a - array or list, usually a 2D array.  Assumes rows >= 1,
    :     it is corrected, as is the number of columns.
    : r_c - tuple/list/array of rows x cols.  See _check.
    """
    a, shp, r, c, r_c = _check(a, r_c)
    shape = (a.shape[0] - r + 1, a.shape[1] - c + 1) + r_c
    strides = a.strides * 2
    a_s = as_strided(a, shape=shape, strides=strides).squeeze()
    return a_s


def block(a, r_c=(3, 3)):
    """See _check and/or stride for documentation.  This function
    :  moves in increments of the block size, rather than sliding
    :  by one row and column.
    """
    a, shp, r, c, r_c = _check(a, r_c)
    shape = (a.shape[0] // r, a.shape[1] // c) + r_c  # integer division
    strides = (r * a.strides[0], c * a.strides[1]) + a.strides
    a_b = as_strided(a, shape=shape, strides=strides).squeeze()
    return a_b
```
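To see what the striding actually produces, here is a small standalone check using plain numpy (independent of the functions above, which wrap the same as_strided call):

```python
import numpy as np
from numpy.lib.stride_tricks import as_strided

a = np.arange(16).reshape(4, 4)

# 3x3 sliding windows: one window per cell that can anchor a full 3x3
shape = (a.shape[0] - 2, a.shape[1] - 2, 3, 3)
windows = as_strided(a, shape=shape, strides=a.strides * 2)

print(windows.shape)              # (2, 2, 3, 3)
print(windows[0, 0])              # the window anchored at the top-left cell
print(windows.mean(axis=(2, 3)))  # the sliding 3x3 mean
```

Because as_strided returns a view, no data are copied until a statistic is actually computed.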

timing decorator, decorated stat function and time test

```
def delta_time(func):
    """simple timing decorator function"""
    import time
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("Timing function for... {}".format(func.__name__))
        t0 = time.perf_counter()        # start time
        result = func(*args, **kwargs)  # ... run the function ...
        t1 = time.perf_counter()        # end time
        print("Results for... {}".format(func.__name__))
        print("  time taken ...{:12.9e} sec.".format(t1 - t0))
        # print("\n {}".format(result))  # print within wrapper
        return result                   # return result
    return wrapper


@delta_time
def array_mean(a):
    """change the func to time other statistics"""
    a_mean = a.mean(axis=(2, 3))
    return a_mean


def time_test():
    """time test for block and sliding raster"""
    N = [100, 500, 1000, 2000, 3000, 4000, 5000]
    for n in N:
        a = np.arange(n * n).reshape((n, n))
        # a0 = stride(a, r_c=(3, 3))  # uncomment this or the line below
        a0 = block(a, r_c=(3, 3))
        print("\nArray size... {0}x{0}".format(n))
        a_stat = array_mean(a0)       # time the function
    return None
```

main section

```
if __name__ == "__main__":
    """comparison between block and slide view"""
    prn = False
    time_test()
```

Description: timing examples for the sliding and block mean.

Timing results

array size    sliding mean (seconds)    block mean (seconds)

1000x1000   3.85e-02      4.65...e-03

2000x2000   1.51e-01      1.75...e-02

3000x3000   3.41e-01      3.86...e-02

4000x4000   6.37e-01      7.20...e-02

5000x5000   9.46e-01      1.04...e-01

The slowest is obviously the sliding function, since it does 9 samples for every 1 that block does when a 3x3 window is used... I think... it is the easy stuff that confuses.  Enjoy and study the code and its origins should you be interested.  I should note that the whole area of sliding window implementations has been quite the area of investigation.  The implementation I presented here is one I chose because I liked it.  It may not be the fastest or the bestest... but I don't care... if it takes less than a sip of coffee, it is good enough for me.

That's all for now....

# Python 3.5 comes to iThings

Posted by Dan_Patterson Jun 16, 2016

Yes ... for a whole $7 Canadian, you too can do cutting edge programming on your iThing ... iPad 2 or above, the iPhone, maybe even the iWatch. See ...  Pythonista for iOS

Well, if you don't have an iPhone, iPad or iWhatever to program on, you can keep up with conventional python innovations and update here:

Coming to Python.... preparation and anticipation

Oh... and os.walk alone is so yesterday... os.scandir is the new kid, providing speedups... failing python 3.5, you can thankfully get it from github:

GitHub - benhoyt/scandir: Better directory iterator and faster os.walk(), now in the Python 3.5 stdlib

or just wait until 3.5 comes along.

Now ... the iStuff

I have mentioned the program Pythonista (available at the iStore) as a great program for doing programming.  I have been using the 2.7 version for about 2 years now and I am impressed with its IDE.  The new version runs both python 3.5 (yes... 3.5) and python 2.7.  It contains many, many modules by default, including numpy and MatPlotLib to name just a few.  You can ship your cool scripts off to github, OneDrive, all those cloud places.  You can even produce pdfs amongst other things.  A downside... some bizarre Apple policy  (no clue what it means) makes it just a tad hard to load a script directly... (but we all know how to copy and paste content don't we, or consult a 12 year old).

So, have a gander at your site-packages folder in your python distribution and just add more.  Do you like line numbering? code completion? indent/dedent? cool IDE themes (like the Dark Theme) and a whole load of other stuff? Then you might be interested.   SciPy and Pandas aren't included yet, so you will have to do your Voronoi diagrams and Delaunay triangulations the old-fashioned way through pure python and/or numpy or MatPlotlib.  Oops, I forgot, most of you don't have those since you aren't using ArcGIS Pro yet, so you won't miss them (you will have to install Pro to get python 3.4.x and the SciPy stack with other goodies).

But how cool... standing at the bus stop... whip out the iPad and run a quick triangulation in Matplotlib... how to impress potential employers who might be watching.

I can't wait until esri gets something that allows me to do analysis on the iThings so I can work with arcpy, numpy and all my favorite libs and pys.  Imagine being able to graph data in the field as it is being collected...

Two screen grabs just to give you a quick view of other stuff.  I will update when I play with the new version more.

Sample script, using a partial split screen for output on an iPad.  Imagine how cool this would look on your iPhone

Alternate theme, for the moody programmer...

The python 2.7 and 3.5 distributions have their own access points with their own site_packages folder and even a shared one.  Kind of like a version of condas... maybe iCondas...

Later

... Dan

# Toolbox creation in ArcGIS Pro

Posted by Dan_Patterson May 19, 2016

A visual demo.  Nothing fancy, more for the record.

1 Begin by creating a toolbox

2  Specify a 'New' script to create a script in the toolbox

3  Now start filling out the dialog's General parameters.

4  The actual Parameters section needs to be created to include Label, Name, Data Type, whether it is required, optional or derived, and whether it is an input or output parameter.

5  Kind of obvious what needs to be done... elapsed time so far, 2 minutes.

Some get a bit more complicated
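Whatever the parameter layout, the called script reads the parameters in the dialog's order.  A minimal, hypothetical skeleton (in a real tool you would normally use arcpy.GetParameterAsText; plain sys.argv is shown here so the sketch runs anywhere, and the parameter names are made up):

```python
import sys

def get_params(argv):
    """Grab tool parameters in the order defined in the Parameters section."""
    in_fc, out_fc = argv[1:3]   # e.g. one input and one output parameter
    return in_fc, out_fc

if __name__ == "__main__" and len(sys.argv) >= 3:
    in_fc, out_fc = get_params(sys.argv)
    print("input: {}  output: {}".format(in_fc, out_fc))
```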

6  You can fill out tool validation if you want.  I personally don't bother unless someone is paying.

7  Time for the help stuff...this is important though

8  Yes... the less than obvious save button will make things good.

You can include all kinds of stuff, like images etc.

Sidebar help

10  Oh yes... the handy little 'i' symbol shows the tool help if you just need a quick tip on the input parameters.

Total elapsed time from start to finish, less than 5 minutes.

--------------------------------------------------------------------------------------------------------------------------------------------

Extras

Multivalue parameters are possible

Store your toolbox, main script and helper script together

You can embed your script in the toolbox should you want to...just don't forget your password and keep a non-embedded version just in case.

Posted by Dan_Patterson May 8, 2016

2019-02-27 Release notes for ArcGIS Pro 2.3.1—ArcGIS Pro | ArcGIS Desktop   Issues addressed list

2019-01-25 Release notes for ArcGIS Pro 2.3—ArcGIS Pro | ArcGIS Desktop Issues addressed list

2019-01-14  What's new in ArcGIS Pro 2.3—ArcGIS Pro | ArcGIS Desktop  lots to read!!!

2019-01-14 Numpy 1.16 released  Last release supporting python 2.x, but only until 2020

2018-12-02  Spyder 3.3.2 has been released, use your conda update skills  https://support.esri.com/en/technical-article/000019162      New...

2018-09-28  Technical article on the ArcGIS Pro 2.2.2 rollback      New...

2018-09-27  Matplotlib 3.0 is released...   New...

2018-09-19   Working-with-packages-and-environments-in-Spyder  New...

A fantastic article on using one Spyder IDE installation for multiple environments

2018-09-19 Matplotlib 3.0 released New...

2018-08-30  Spyder... install once, use in multiple environments New... 2018-08-30

2018-08-16 Install Spyder, Jupyter console and Jupyter notebook for ArcGIS Pro by default  lets get on this one

2018-05-31  Beyond NumPy Arrays...  for those that haven't got there yet... just an FYI

2018-05-31 Threads and other cool stuff  for measurement, not sewing

.....  See the categories below for older posts

The Bug List and Change logs

Product information, main link including: Release notes, generic link to version fixes

ArcGIS ...  10.6 issues addressed, 10.5.1 issues addressed, 10.5.1 changes, version 10.5 fixes

ArcGIS Pro ...  ArcGIS Pro 2.2 issues addressed

ArcGIS Python API ...  ArcGIS API for Python v1.2  New

Python ...  from 3.7 back, updated with New

SciPy ...  release notes, all versions

MatPlotLib ...  version 2.1 release, version 2 release

Pandas ...  version 0.20 release

To be updated as I see fit.  I have posted a number of blog posts and documents and list them in reverse order.

They are categorized as follows:

------------------------------------------------------------------------------------------------------------------------

My toolboxes on ArcScripts 2.0 Beta  Link...

### -------- Esoterica:

------------------------------------------------------------------------------------------------------------------------

### -------- Documentation...

ArcGIS Pro Essential Workflows     ArcGIS Pro Essential Workflows - updated

Python 2 and 3 key differences       key differences link

Main documentation page                Documentation | ArcGIS for Desktop

Arcpy and geoprocessing                What is geoprocessing?—Help | ArcGIS for Desktop

A good source of code examples    http://stackoverflow.com/

Esri at GitHub                                    Esri GitHub ...

NumPy/SciPy documentation          NumPy — Numpy

Python for iThingy's                          Pythonista    comes with matplotlib, numpy and loads more

Advanced Python notes                    Python Scientific Lecture Notes — Scipy lecture notes

Matplotlib the graphing package       matplotlib: python plotting — Matplotlib 1.4.3 documentation

PythonPedia                                      https://pythonpedia.com/

Awesome Python                              http://awesome-python.com/

ArcGIS Pro, SciPy, Python...            Python en ArcGIS Pro - CCU2015.pdf

SciPy Lecture Notes                          Scipy Lecture Notes

How to make mistakes in python    How to make mistakes in Python... a useful link

--------

Posted by Dan_Patterson Apr 29, 2016

The attached pdf will serve for now.  I will add additional documentation here on how to work with rasters with nodata areas here soon.

A simple example here, using Matplotlib to do the mapping.  The -- cells are nodata values

Raster/array values and sample properties
```
>>> print(c.filled())  # -9 is nodata
[[ 3  1  3 -9  3  2 -9 -9 -9  2  2  2  3 -9  1 -9  3  3 -9  1]
[ 1  3  2  3 -9 -9 -9  3  1  3  3 -9  3  2  3  3  3  3  1  1]
[ 2  2  2 -9  1  2  2 -9  2  2  3 -9  3 -9  3 -9  2  2  1  3]
[ 3  1  3  2 -9 -9  2 -9  2 -9 -9  1  2  1  1  2  1  3  3 -9]
[-9  2 -9  2 -9 -9  1  2  1 -9 -9  2  2  1  1  1  3  1  3  3]
[-9  2  3  3  1  2  2  3 -9  1  1  3  1 -9 -9 -9  2  1  3 -9]
[-9 -9 -9  2  1  1 -9  2  2  1  1  2  2  3  2  3  2  2  2 -9]
[ 3 -9  3 -9 -9  2  3 -9  3  2  2  2  1 -9  3  2 -9  2  2  1]
[ 1 -9  2  2  1 -9  2  1  2  2 -9 -9  3 -9  2  2 -9  1 -9  1]
[ 1  1 -9 -9 -9  2 -9  2  3  2 -9  1  2  1  3  1 -9 -9  1  3]
[ 1 -9 -9  1  2  1 -9  1 -9 -9 -9 -9  1 -9 -9 -9 -9  2 -9  3]
[-9  3 -9 -9 -9  2  3 -9 -9  1  2  1  1  2  1  1  3  2  3  2]
[ 3  3  3 -9  3  1  3 -9 -9 -9  3  2 -9 -9  3  2  3 -9  1 -9]
[-9  2  2  3  3  1  3  1 -9 -9  2  3  3  1  1  1  1  1 -9  1]
[-9  1 -9  3  1  1 -9  2 -9  1  1  2  2  1 -9  2  2  3 -9  3]
[ 3  2 -9  2  2 -9 -9  2  1  2  1  2 -9  3  2  1  1  1  3  1]
[-9 -9  3  2  2 -9  2  2  1  2 -9  3  1  2  2 -9  3  3  2  1]
[ 1 -9  1 -9  2  2  3 -9  3  2  2 -9  1  2  3 -9 -9 -9  3  3]
[ 3  1  1  1  2  1  2 -9 -9  2  2  1  2 -9 -9 -9  2 -9  3  2]
[-9  2  3  1 -9  1  1 -9  2 -9  1  1  1  2  1  2  3 -9 -9  3]]
```

```
>>> c.mean()
1.96028880866426
>>> c.min()
1
>>> c.max()
3
>>> np.histogram(c, bins=[1, 2, 3, 4])
(array([ 92, 104,  81]), array([1, 2, 3, 4]))
```
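The pattern behind those properties can be checked on a small stand-in array (the values here are made up for illustration; -9 is again the nodata value):

```python
import numpy as np

vals = np.array([[3, 1, 3, -9],
                 [1, 3, 2, 3]])
c = np.ma.masked_equal(vals, -9)   # mask the nodata cells

print(c.mean())     # the mean of the 7 unmasked cells only
print(c.filled())   # the nodata value reappears for export
```

np.ma.masked_equal also sets the fill value to the masked value, so filled() restores the -9 cells.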

# Distance Calculations Revisited...

Posted by Dan_Patterson Apr 20, 2016

There are a large number of questions that deal with some aspect of distance.  If you are interested in quick distance calculations, then download and read the attachment.

The tools to get the data into this form are built in to arcpy, facilitating the calculations.

You can determine distances using different metrics... euclidean distance is shown here.

If you need to find the nearest, a list of closest objects or to calculate the perimeter of a polygon or length of a line represented by a sequence of points, there are other options available to you.

Associated references:

Python Near Analysis (No Advanced Licence)

Single origin to multiple destinations

Multiple origins to multiple destinations

The example here uses a single origin in an origin list.  You will note that the coordinates are given as a list of lists of coordinates.

Example 1...

```
Single origin...
[[ 0.  0.]]
Multiple destinations
[[ 4.  0.]
[ 0.  2.]
[ 2.  2.]
[-2. -3.]
[ 4.  4.]]
Distances: [ 4.    2.    2.83  3.61  5.66]
```
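The Example 1 distances can be reproduced in a few lines of numpy.  This is a sketch of the idea, not the e_dist function in the attachment:

```python
import numpy as np

orig = np.array([[0., 0.]])                  # single origin
dest = np.array([[4., 0.], [0., 2.], [2., 2.],
                 [-2., -3.], [4., 4.]])      # multiple destinations

diff = dest - orig
dists = np.sqrt(np.einsum('ij,ij->i', diff, diff))
print(np.round(dists, 2))   # [4.   2.   2.83 3.61 5.66]
```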

An example for 3D length/distance

X, Y and Z values array, their differences and the resultant distances and the length if they formed a circuit

```
e_leng(d3d, verbose=True)
Input array....(shape=(1, 8, 3))
[[[ 0.  0.  0.]
[ 1.  1.  1.]
[ 0.  1.  0.]
[ 1.  0.  1.]
[ 0.  1.  1.]
[ 1.  1.  0.]
[ 1.  0.  0.]
[ 0.  0.  1.]]]
differences...

[[[-1. -1. -1.]
[ 1.  0.  1.]
[-1.  1. -1.]
[ 1. -1.  0.]
[-1.  0.  1.]
[ 0.  1.  0.]
[ 1.  0. -1.]]]
distances...
[[ 1.73  1.41  1.73  1.41  1.41  1.    1.41]]
length...[ 10.12]
```
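The differences/distances/length steps shown above amount to np.diff plus a hypotenuse.  A minimal sketch of the same calculation (the signs of the differences may differ from the printout; the distances are unaffected):

```python
import numpy as np

pts = np.array([[0., 0., 0.], [1., 1., 1.], [0., 1., 0.], [1., 0., 1.],
                [0., 1., 1.], [1., 1., 0.], [1., 0., 0.], [0., 0., 1.]])

diff = np.diff(pts, axis=0)                # successive point differences
dists = np.sqrt((diff ** 2).sum(axis=1))   # 3D segment lengths
length = dists.sum()
print(np.round(dists, 2))   # [1.73 1.41 1.73 1.41 1.41 1.   1.41]
print(round(length, 2))     # 10.12
```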

This example uses the same destinations, with multiple origins.

```
Example 2...
Multiple Origins...
[[[ 0.  0.]]
[[ 1.  0.]]

[[ 0.  1.]]
[[ 1.  1.]]]

Destinations...
[[ 4.  0.]
[ 0.  2.]
[ 2.  2.]
[-2. -3.]
[ 4.  4.]]

Distances...
[[ 4.    2.    2.83  3.61  5.66]
[ 3.    2.24  2.24  4.24  5.  ]
[ 4.12  1.    2.24  4.47  5.  ]
[ 3.16  1.41  1.41  5.    4.24]]

Origin-Destination, distance matrix
dests->:     0     1     2     3     4
origins
0: [ 4.    2.    2.83  3.61  5.66]
1: [ 3.    2.24  2.24  4.24  5.  ]
2: [ 4.12  1.    2.24  4.47  5.  ]
3: [ 3.16  1.41  1.41  5.    4.24]
```
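The origin-destination matrix is a broadcasting exercise.  Again a sketch of the idea rather than the attachment's code:

```python
import numpy as np

origins = np.array([[0., 0.], [1., 0.], [0., 1.], [1., 1.]])
dests = np.array([[4., 0.], [0., 2.], [2., 2.], [-2., -3.], [4., 4.]])

# broadcast to shape (n_origins, n_dests, 2), then reduce the last axis
diff = origins[:, None, :] - dests[None, :, :]
d = np.sqrt((diff ** 2).sum(axis=-1))
print(np.round(d, 2))   # one row per origin, one column per destination
```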

That's all for now...

# Coming to Python.... preparation and anticipation

Posted by Dan_Patterson Apr 11, 2016

I always forget.  I read about something pythonic, then quasi-remember what it was I read.  So this is what is coming to python.  A look into the future.  Preparing for the future.

2019-02-21 Python 3.8 beta is out  Starting to run out of dot numbers

2018-06-28 Python 3.7 official release  Yes it is here... just add 1.0 to the version you are working on

Lots of speed improvements for many widely used functions.  Python 3.8 is on the way.

2018-03-31  numbers

Hey, almost forgot this one.  Ever tried to type in UTM coordinates for those that live in the 7 figure northing range?  You poke at the screen counting zeros to make sure you get the northing right and don't end up near the equator.  Along come underscores in numbers.

underscores in numbers
```
>>> N = 5_025_000.01
>>> N
5025000.01
```

Cool ehh? No more dirty monitors.  There is more in the specs for hex and nibbly bits etc

Underscores in Numeric Literals for those that are interested

--------------------------------------------------------------------------------------------------------------------------------------------------------

--------------------------------------------------------------------------------------------------------------------------------------------------------

ArcGIS Pro 2.1 ships with python 3.6.2    2018-01-17

Python 3.0 released  on December 3rd, 2008.

Python 3.6 expected final  December 12th, 2016    8 years old!

Python 3.7 beta released  September 23rd, 2017

Exiting 2.7... supporters and timeline 2020.... set your clocks

Python 3.7.0a math.remainder ...

Numpy 1.13 release notes  updated

Apparently the wait will continue arcmap 10.5 python version??

Recent blog  python 3.6... not using python 3 yet?....

Python 3.7 beta...  Overview — Python 3.7.0a0 documentation

Conda tips and tools...  Conda and ArcGIS Pro | ArcPy Café

Python 3.6 final coming soon... sorted dictionary keys and more  Release information and changes

Anaconda, Spyder and PRO Anaconda, Spyder and ArcGIS PRO

Python & Pro  Python and ArcGIS Pro 1.3 : Conda

SciPy.org    SciPy.org — SciPy.org latest release for numpy 1.11.1 and SciPy 0.17.1

Learning resourcesData Science in Python

GitHub - Data Science Python: common data analysis and machine learning tasks using python

A classic example is those of you that have avoided reading:

Oh wait!! Many of you are still using python 2.7.  Good news! Install ArcGIS PRO and enter the realm of Python 3.4.  Yes I did say 3.4... but it gets worse.... Python 3.6 is in beta as of Sept 2016.

Python 2.7.x is the last of the 2.x series.  It will remain on support for some time (to indefinitely), but there have been lots happening since Python 3.0 was introduced (homework... when was python 3.0 introduced).

So this is not going to be a retrospective going back to python 3.0 and what it introduced, but I will highlight what is going forward in the python world since python 3.6 is set for final release soon.. and you will be using python 3.4 ... IF you install ArcGIS PRO or dabble in the neverworld of alternate installs.  So in reverse order, this is what I think will be useful for you to look forward to, and what you may have missed.

## -------------------------------------------------------------------------------------------------

Contents : Python  NumPy  SciPy  Matplotlib  Pandas       Update:  2017-02-18

Last Update:   numpy 1.12 and scipy 0.17.1; added Pandas 0.18.1 release  What’s New — pandas 0.18.1 documentation

## Python section

The main link  What’s New in Python ... this goes back in history

Pre-existing functionality  What is in version 3 that existed in version 2.6

with statement, print as a function, io module

----- python 3.8 ---------------------------------------------------------------------------------------

What’s New In Python 3.8 ... main page
Highlights

----- python 3.7 ---------------------------------------------------------------------------------------
What’s New In Python 3.7 ... main page
Highlights

----- python 3.6 ---------------------------------------------------------------------------------------

What’s New In Python 3.6 ... main page

Highlights

```
>>> name = "Fred"
>>> f"He said his name is {name}."
'He said his name is Fred.'
```

---- python 3.5 -------------------------------------------------------------------------------------------

What’s New In Python 3.5 ...main page

• Highlights

```
>>> *range(4), 4
(0, 1, 2, 3, 4)
>>> [*range(4), 4]
[0, 1, 2, 3, 4]
>>> {*range(4), 4, *(5, 6, 7)}
{0, 1, 2, 3, 4, 5, 6, 7}
>>> {'x': 1, **{'y': 2}}
{'x': 1, 'y': 2}
```

• Improved Modules
• link includes, but not limited to:
• collections, csv, distutils, doctest, enum, inspect, io, json, locale, logger, math, os, pathlib, re, readline, sys, time, timeit, traceback, types, zipfile
• collections.OrderedDict is now implemented in C which makes it 4 to 100 times faster.

• csv.writer.writerows now supports arbitrary iterables
• enum.Enum now supports a start number in its functional API
• math ... added the nan and inf constants (finally!!!) and gcd (greatest common divisor)
• os.walk ... significant speed improvements
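For the doubters, a quick check of the new math additions mentioned above:

```python
import math

# New in Python 3.5: the inf and nan constants, and gcd
print(math.inf > 1e308)      # inf compares greater than any finite float
print(math.gcd(24, 36))      # greatest common divisor -> 12
print(math.isnan(math.nan))  # the proper way to test for nan
```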

---- python 3.4 -----------------------------------------------------------------------------------------

What's New in Python 3.4 ... main page

• Highlights

>>> import statistics

>>> dir(statistics)

[ .... snip ... 'mean', 'median', 'median_grouped', 'median_high', 'median_low', 'mode', 'pstdev', 'pvariance', 'stdev', 'variance']

• Improved Modules
• link includes, but not limited to:  collections, inspect, multiprocessing, operator, os, re, sys, textwrap, zipfile
• textwrap adds the max_lines and placeholder parameters, and the shorten function
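The textwrap additions in action:

```python
import textwrap

text = "The quick brown fox jumps over the lazy dog"
# shorten collapses whitespace and truncates to fit the width
print(textwrap.shorten(text, width=25, placeholder=" ..."))
# -> 'The quick brown fox ...'
# max_lines limits wrapped output, appending the placeholder if truncated
print(textwrap.wrap(text, width=15, max_lines=2, placeholder=" ..."))
```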

---- python 3.3 and before  -------------------------------------------------------------------------------

Previous stuff (aka pre-3.4)

• porting to 2to3            2to3 - Automated Python 2 to 3 code translation
• changes to or replacement of :  apply, asserts, dict (dict.items(), dict.keys(), dict.values()), exec, has_key, isinstance, itertools, long, map, next, nonzero, operator, print, raise, range, raw_input, unicode, xrange, zip
• virtual environments    pep 405
• python launcher          pep 397
• ordered dictionaries    pep 372
• print as a function       pep 3105
• sysconfig                    Python’s configuration information
• argparse
• division                   / is true (float) division, // is floor (integer) division

way more ..... you will just have to read the original migration docs.

Significant changes to text handling and the migration to full unicode support.

Some of the changes have been back-ported to Python 2.7 and/or can be accessed via ... from __future__ import ....
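A quick sketch of two of the behaviours mentioned above (print as a function, true vs floor division), runnable under either 2.7 or 3.x thanks to the __future__ imports:

```python
from __future__ import print_function, division

# Under Python 2.7 these imports back-port the Python 3 behaviour;
# under Python 3 they are harmless no-ops.
print(3 / 2)              # true division  -> 1.5 (not 1, as in old 2.x)
print(3 // 2)             # floor division -> 1
print('a', 'b', sep='-')  # print is a function, with keyword arguments
```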

## Anaconda section

Anaconda, Spyder and ArcGIS PRO

Python and ArcGIS Pro 1.3 : Conda

## NumPy section

The reverse chronological list of changes to numpy Release Notes — NumPy v1.13 Manual

numpy 1.13
• new functions... np.isin, np.divmod, np.block, plus others
• the deprecations and changes list is quite long... many improvements to subclass carry-forwards
• too many more to list
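The two new functions named above are easy to demonstrate (assuming numpy >= 1.13):

```python
import numpy as np

a = np.arange(9).reshape(3, 3)
# np.isin: element-wise membership test that keeps the input's shape
mask = np.isin(a, [2, 3, 5, 7])
# np.divmod: quotient and remainder in a single call
q, r = np.divmod(a, 4)
print(mask)
print(q.tolist())  # [[0, 0, 0], [0, 1, 1], [1, 1, 2]]
```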

numpy 1.12
• einsum optimized for speed
• keepdims added to many functions
• axis keyword for rot90
• flipud and fliplr now have axis generalizations
• nancumsum and nancumprod added to the nan line of functions
• too many more to list

numpy 1.10
• np.rollaxis, np.swapaxes now return views
• np.ravel, np.diagonal, np.diag now preserve subtypes
• recarray field and view... changes in how data are treated, see the documentation
• the matrix @ operator, implemented in keeping with its introduction in python 3.5

## SciPy section

Too many to list, but the change logs are given here Release Notes — SciPy v0.17.0 Reference Guide

SciPy central  http://central.scipy.org/

## Matplotlib section

chronologic change history   What’s new in matplotlib — Matplotlib 1.5.1 documentation

## Pandas section

Pandas 0.18.1         What’s New — pandas 0.18.1 documentation

The full list is here   pandas main page and release notes

# Creating data for testing purposes

Posted by Dan_Patterson Apr 4, 2016

Have you ever come across a situation like one of these:

• you need to test something out but don't have the data
• you are sick of trying to get a function to work in the field calculator
• you want to test one of ArcMap's functions but none of your data are suitable
• all you need are some points with a particular distribution
• someone forgot to post a sample of their data on GeoNet for testing and you don't have a match
• you forgot to collect something in the field

Well, this lesson is for you.  It is a culmination of a number of the previous lessons and a few NumPy Snippets and Before I Forget posts.  I have attached a script to this post below.

There is also a GitHub repository that takes this one step further providing more output options... see Silly on GitHub

The following provides the basic requirements to operate a function should you choose not to
incorporate the whole thing.  Obviously, the header section enclosed within triple quotes
isn't needed but the import section is.

```python
# -*- coding: UTF-8 -*-
"""
:Script:   random_data_demo.py
:Author:   Dan.Patterson AT carleton.ca
:Modified: 2015-08-29
:Purpose:
:  Generate an array containing random data.  Optional fields include:
:  ID, Shape, text, integer and float fields
:Notes:
:  The numpy imports are required for all functions
"""
# ---------------------------------------------------------------------
# Required imports
from functools import wraps
import numpy as np
import numpy.lib.recfunctions as rfn
np.set_printoptions(edgeitems=5, linewidth=75, precision=2,
                    suppress=True, threshold=5,
                    formatter={'bool': lambda x: repr(x.astype('int32')),
                               'float': '{: 0.2f}'.format})
# ---------------------------------------------------------------------
# Required constants  ... see string module for others
str_opt = ['0123456789',
           '!"#$%&\'()*+,-./:;<=>?@[\\]^_`{|}~',
           'abcdefghijklmnopqrstuvwxyz',
           'ABCDEFGHIJKLMNOPQRSTUVWXYZ',
           'abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ'
           ]
# ---------------------------------------------------------------------
# decorator
def func_run(func):
    """Prints basic function information and the results of a run.
    :Required:  from functools import wraps
    """
    @wraps(func)
    def wrapper(*args, **kwargs):
        print("\nFunction... {}".format(func.__name__))
        print("  args.... {}\n  kwargs.. {}".format(args, kwargs))
        print("  docs.... \n{}".format(func.__doc__))
        result = func(*args, **kwargs)
        print("{!r:}\n".format(result))  # comment out if results not needed
        return result                    # for optional use outside.
    return wrapper
# ---------------------------------------------------------------------
# functions
```

Before I go any further, lets have a look at the above code.

• line 14 ... functools' wraps - I will be using decorators to control output, and wraps handles all the fiddly stuff in decorators (see Before I Forget # 14)
• line 16 ... numpy.lib.recfunctions is a useful module for working with ndarrays, and recarrays in particular... it is imported as rfn
• lines 17-20 ... np.set_printoptions allows you to control how arrays are formatted when printing or working from the command line.  Most of the parameters are self-explanatory, or you will soon get the drift
• lines 30-43 ... the decorator function presented in BIF # 14

Now back to the main point: generating data with some control over the output.  The functions below do just that, and put the results together into a standalone table or feature class.

An example follows:

```
Array generated....
array([(0, (7.0, 1.0), 'E0', '(0,0)', 'A', 'ARXYPJ', 'cat', 'Bb', 0, 9.380410289877375),
       (1, (2.0, 9.0), 'D0', '(4,0)', 'B', 'RAMKH', 'cat', 'Aa', 9, 1.0263298179133362),
       (2, (5.0, 8.0), 'C0', '(1,0)', 'B', 'EGWSC', 'cat', 'Aa', 3, 2.644448491753841),
       (3, (9.0, 7.0), 'A0', '(1,0)', 'A', 'TMXZSGHAKJ', 'dog', 'Aa', 8, 6.814471938888746),
       (4, (10.0, 3.0), 'E0', '(1,0)', 'B', 'FQZCTDEY', '-1', 'Aa', 10, 2.438467639965038)],
       ............. < snip >
      dtype=[('ID', '<i4'), ('Shape', [('X', '<f8'), ('Y', '<f8')]),
             ('Colrow', '<U2'), ('Rowcol', '<U5'), ('txt_fld', '<U1'),
             ('str_fld', '<U10'), ('case1_fld', '<U3'), ('case2_fld', '<U2'),
             ('int_fld', '<i4'), ('float_fld', '<f8')])
```

Here are the code snippets...

Code snippets

```python
def pnts_IdShape(N=10, x_min=0, x_max=10, y_min=0, y_max=10):
    """Create an array with a nested dtype which emulates a shapefile's
    :  data structure.  This array is used to append other arrays to enable
    :  import of the resultant into ArcMap.  Array construction, after hpaulj
    :  http://stackoverflow.com/questions/32224220/
    :      methods-of-creating-a-structured-array
    """
    Xs = np.random.random_integers(x_min, x_max, size=N)
    Ys = np.random.random_integers(y_min, y_max, size=N)
    IDs = np.arange(0, N)
    c_stack = np.column_stack((IDs, Xs, Ys))
    if simple:     # version 1
        dt = [('ID', '<i4'), ... < snip >
```

The above functions can be used with the main portion of the script and your own function.

Sample function

```python
# required imports
# required constants
# pnts_IdShape  function
# rand_case  function
# rand_int  function

def blog_post():
    """sample run"""
    N = 10
    id_shape = pnts_IdShape(N, x_min=300000, x_max=300500,
                            y_min=5000000, y_max=5000500)
    case1_fld = rand_case(N, cases=['cat', 'dog', 'fish'],
                          p_vals=[0.6, 0.3, 0.1])
    int_fld = rand_int(N, begin=0, end=10)
    fld_names = ['Pets', 'Number']
    fld_data = [case1_fld, int_fld]
    arr = rfn.append_fields(id_shape, fld_names, fld_data, usemask=False)
    return arr

if __name__ == '__main__':
    """create ID,Shape,{txt_fld,int_fld...of any number}"""
    returned = blog_post()
```

```
array([(0, (300412.0, 5000473.0), 'dog', 4),
       (1, (300308.0, 5000043.0), 'cat', 4),
       (2, (300443.0, 5000170.0), 'dog', 5),
       (3, (300219.0, 5000240.0), 'cat', 0),
       (4, (300444.0, 5000067.0), 'cat', 9),
       (5, (300486.0, 5000106.0), 'cat', 3),
       (6, (300242.0, 5000145.0), 'cat', 5),
       (7, (300038.0, 5000341.0), 'dog', 7),
       (8, (300335.0, 5000495.0), 'cat', 9),
       (9, (300345.0, 5000108.0), 'fish', 7)],
      dtype=[('ID', '<i4'), ('Shape', [('X', '<f8'), ('Y', '<f8')]),
             ('Pets', '<U4'), ('Number', '<i4')])
```

You will notice in the above example that the rand_case function was used to assign the type of pet based upon p-values of 0.6, 0.3 and 0.1, with cats being favored, as they should be, and this is reflected in the data.  The coordinates in this example were left as integers, reflecting a 1 m resolution.

It is possible to add a random perturbation of floating point values in the +/- 0.99 m range to give centimetre values if you desire.

This is not shown here, but I can provide the example if needed.

The 'Number' field in this example simply reflects the number of pets per household.
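As an aside, if you don't want to roll your own rand_case-style function, numpy ships a weighted sampler.  This is a sketch using np.random.choice... an assumption on my part, since the attached script's rand_case may be implemented differently:

```python
import numpy as np

np.random.seed(42)  # for a reproducible demo
# Weighted categorical sampling, similar in spirit to rand_case above
pets = np.random.choice(['cat', 'dog', 'fish'], size=1000,
                        p=[0.6, 0.3, 0.1])
vals, counts = np.unique(pets, return_counts=True)
print(dict(zip(vals.tolist(), counts.tolist())))  # cats dominate, as planned
```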

Homework...

Using NumPyArrayToFeatureClass, create a shapefile using the NAD_1983_CSRS_MTM_9 projection.

```
>>> import arcpy
>>> a = blog_post()  # do the run if it isn't done
>>> # ..... snip ..... the output
>>> # ..... snip ..... now create the featureclass
>>> SR_name = 32189  # u'NAD_1983_CSRS_MTM_9'
>>> SR = arcpy.SpatialReference(SR_name)
>>> output_shp = 'F:/Writing_Projects/NumPy_Lessons/Shapefiles/out.shp'
>>> arcpy.da.NumPyArrayToFeatureClass(a, output_shp, 'Shape', SR)
```

Result

That's all...

# Viewing data from a different perspective

Posted by Dan_Patterson Mar 29, 2016

Data can be viewed in a variety of ways.  This post shows a few of the options.  The examples I am going to use are arrays, but they could equally be replaced with small maps or other forms of data.

Picture this... a 3D list of data, like observations for 3 people.  A simple printout of the data is pretty easy to digest if the set is small and the dimensions are few.  Here are a couple of options.  The first column (simple 3D array) is the standard presentation.  The second adds a bit more information and puts the dimension number in place of the blank row used previously, which would be handy if the list were long.  The last reshuffles the data so that each entry in the third dimension becomes a column block, consolidating the number of rows needed to show data that have fewer columns than rows.

The array... (a_3d) ...

```[[[ 7  2 11]
[ 2  2  5]
[ 7  0  1]
[ 8  5  4]
[ 8  8  6]
[10  4  0]]

[[ 7 13  5]
[10 11  4]
[11  0  2]
[ 8  5 11]
[ 7  2  0]
[ 2 11  6]]

[[ 9  6  9]
[ 5  7  0]
[12  9  6]
[ 3  4 10]
[ 9 14  1]
[ 7  3 11]]]
```

de_line(a_3d) with extra info...

```array...
shape (3, 6, 3) ndim 3 size 54
a[0]...
[[[ 7  2 11]
[ 2  2  5]
[ 7  0  1]
[ 8  5  4]
[ 8  8  6]
[10  4  0]]
a[1]....
[[ 7 13  5]
[10 11  4]
[11  0  2]
[ 8  5 11]
[ 7  2  0]
[ 2 11  6]]
a[2]....
[[ 9  6  9]
[ 5  7  0]
[12  9  6]
[ 3  4 10]
[ 9 14  1]
[ 7  3 11]]]
```

to_row(a_3d)

```Array...  3D   r  c
shape: (3, 6, 3) size: 54
a[0]......  a[1]......  a[2]......
[ 7  2 11]  [ 7 13  5]  [ 9  6  9]
[ 2  2  5]  [10 11  4]  [ 5  7  0]
[ 7  0  1]  [11  0  2]  [12  9  6]
[ 8  5  4]  [ 8  5 11]  [ 3  4 10]
[ 8  8  6]  [ 7  2  0]  [ 9 14  1]
[10  4  0]  [ 2 11  6]  [ 7  3 11]

```
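For simple cases, numpy can produce this side-by-side column-block layout almost for free: iterating a 3D array over its first axis yields the 2D slices, and stacking those horizontally gives the view shown above (minus the headers... the to_row function presumably adds those):

```python
import numpy as np

a_3d = np.arange(24).reshape(2, 4, 3)
# Stack the first-axis slices side by side: each a[i] becomes a column block
side_by_side = np.hstack(a_3d)
print(side_by_side.shape)   # (4, 6)
print(side_by_side)
```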

The tabular data shown in the last column can be presented in graphical form if the arrays are fairly simple.

Color would be cool, with or without labels, or you can skip the fill and simply use the labels.

Of course this can be applied to higher or lower dimensions as in the case of the 4D array below, which consists of 5 3D arrays.

4D shaping

```Array... 4D  shape: (5, 3, 6, 3) size: 270
3D subarrays: 5

Array...  3D   r  c
shape: (3, 6, 3) size: 54
a[0]......  a[1]......  a[2]......
[ 7  2 11]  [ 7 13  5]  [ 9  6  9]
[ 2  2  5]  [10 11  4]  [ 5  7  0]
[ 7  0  1]  [11  0  2]  [12  9  6]
[ 8  5  4]  [ 8  5 11]  [ 3  4 10]
[ 8  8  6]  [ 7  2  0]  [ 9 14  1]
[10  4  0]  [ 2 11  6]  [ 7  3 11]

Array...  3D   r  c
shape: (3, 6, 3) size: 54
a[0]......  a[1]......  a[2]......
...
SNIP
...
[16 10  6]  [ 8 17 12]  [13  9 17]

Array...  3D   r  c
shape: (3, 6, 3) size: 54
a[0]......  a[1]......  a[2]......
[15 10 19]  [15 21 13]  [17 14 17]
[10 10 13]  [18 19 12]  [13 15  8]
[15  8  9]  [19  8 10]  [20 17 14]
[16 13 12]  [16 13 19]  [11 12 18]
[16 16 14]  [15 10  8]  [17 22  9]
[18 12  8]  [10 19 14]  [15 11 19]
```

I suppose it is a matter of perspective which format you prefer.  Comments welcome prior to posting my code in case people have views they would like to see.

# Slope, Aspect and Hillshade... No SA?

Posted by Dan_Patterson Mar 12, 2016

Like it says.

Modified:  2016-12-05

Simplified the aspect calculation and added the ability to classify 'flat' during aspect calculations.  This should clean up those pesky, seemingly random aspect values in areas of low slope.

This is simply an implementation of the 3rd order finite difference method of determining slope and aspect (Horn 1981, Burrough et al., various) used by the Spatial Analyst extension.  It is also the base for a variety of terrain derivatives.  I have improved the base algorithm to facilitate using larger arrays/rasters in a shorter period of time.  This implementation doesn't account for nodata values... I will leave that for the Code Sharing site.  I will leave the code documentation and discussion attached, but bear in mind it is much slower than the code below... the fundamentals, however, are the same.

This approach provides an estimate of slope which attempts to account for the cell values (i.e. elevations) in the 8 neighbours surrounding a core cell.  A measure of the slope in both the X and Y directions is made and averaged.

Alternate measures of slope can be simply implemented using variants of the fundamental algorithms documented in the attachment.  The author has also worked with implementations of simple hillshading and related neighbour functions which employ the same or similar methods of parsing 2D arrays into a 4D array.

The block or tiling process that is used to produce the subarrays has been documented in previous threads and here.

-

Sample surface...

Slopes form a pyramid. Edges trimmed...

```
>>> Dem...
[[0 1 2 3 2 1 0]
 [1 2 3 4 3 2 1]
 [2 3 4 5 4 3 2]
 [3 4 5 5 5 4 3]
 [2 3 4 5 4 3 2]
 [1 2 3 4 3 2 1]
 [0 1 2 3 2 1 0]]
Slope...
[[ 15.79  15.79  11.31  15.79  15.79]
 [ 15.79  13.9    8.53  13.9   15.79]
 [ 11.31   8.53   0.     8.53  11.31]
 [ 15.79  13.9    8.53  13.9   15.79]
 [ 15.79  15.79  11.31  15.79  15.79]]
Aspect...
[[ 315.  315.    0.   45.   45.]
 [ 315.  315.    0.   45.   45.]
 [ 270.  270.  270.   90.   90.]
 [ 225.  225.  180.  135.  135.]
 [ 225.  225.  180.  135.  135.]]
```

Slope calculations... Aspect calculations

```
>>> # ---- Slope calculation using finite difference ----
>>> Slope face...
[[1 1 1]
 [3 3 3]
 [5 5 5]]
N      dx    deg.     %
(1 )   0.5   71.6  400.0
(2 )   1.0   56.3  200.0
(3 )   2.0   36.9  100.0
(4 )   4.0   20.6   50.0
(5 )   6.0   14.0   33.3
(6 )   8.0   10.6   25.0
(7 )  10.0    8.5   20.0
(8 )  12.5    6.8   16.0
(9 )  15.0    5.7   13.3
(10)  20.0    4.3   10.0
(11)  25.0    3.4    8.0
(12)  40.0    2.1    5.0
(13)  80.0    1.1    2.5
(14) 100.0    0.9    2.0
>>> # ---- dx is the cell width ----
>>> # It is a north facing slope;
>>> # with dx=4 and dz=2, you get a
>>> # 20.6 deg. or 50% slope
```

```
>>> # ---- Some windows with aspect determination ----
(0) Array aspect... 0.0
[[1 0 1]
 [2 2 2]
 [3 4 3]]
(1) Array aspect... 315.0
[[0 1 2]
 [1 2 3]
 [2 3 4]]
(2) Array aspect... 270.0
... snip ...
(4) Array aspect... 180.0
[[3 4 3]
 [2 2 2]
 [1 0 1]]
(5) Array aspect... 135.0
[[4 3 2]
 [3 2 1]
 [2 1 0]]
(6) Array aspect... 90.0
[[3 2 1]
 [4 2 0]
 [3 2 1]]
(7) Array aspect... 45.0
[[2 1 0]
 etc... etc ...
```

Now for some functions

The code below provides a check on the arrays, then a few helper functions, before the slope_a and aspect_a functions.  A demo function is used to show the results for a 3xN window with known properties.  I have used this easily on arrays in the 1000's in both x and y.

```python
import sys
import numpy as np
from numpy.lib.stride_tricks import as_strided
from textwrap import dedent, indent
import matplotlib.pyplot as plt

ft = {'bool': lambda x: repr(x.astype('int32')),
      'float': '{: 0.1f}'.format}
np.set_printoptions(edgeitems=10, linewidth=100, precision=2,
                    suppress=True, threshold=200, formatter=ft)
np.ma.masked_print_option.set_display('-')


def _check(a, r_c, subok=False):
    """Performs the array checks necessary for stride and block.
    : see arr_tools
    """
    if isinstance(r_c, (int, float)):
        r_c = (1, int(r_c))
    r, c = r_c
    if a.ndim == 1:
        a = np.atleast_2d(a)
    r, c = r_c = (min(r, a.shape[0]), min(c, a.shape[1]))
    a = np.array(a, copy=False, subok=subok)
    return a, r, c, tuple(r_c)


def stride(a, r_c=(3, 3)):
    """Provide a 2D sliding/moving view of an array.
    : see arr_tools
    """
    a, r, c, r_c = _check(a, r_c)
    shape = (a.shape[0] - r + 1, a.shape[1] - c + 1) + r_c
    strides = a.strides * 2
    a_s = as_strided(a, shape=shape, strides=strides)  # .squeeze()
    return a_s


def filter_a(a_s, filter=None, cell_size=1):
    """Used by aspect, slope and hillshade to filter a raster/array.
    :Requires:
    :--------
    : a_s - a r*c*3x3 strided array
    : filter - a 3x3 filter to apply to a_s
    : cell_size - for slope (actual size*8), aspect (8 is required)
    """
    f_dxyz = filter
    cell_size = cell_size * 8.0
    if filter is None:
        f_dxyz = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]], dtype="float64")
    a_s = a_s * f_dxyz
    dz_dx = (a_s[..., :, 2] - a_s[..., :, 0]).sum(axis=-1) / cell_size
    dz_dy = (a_s[..., 2, :] - a_s[..., 0, :]).sum(axis=-1) / cell_size
    return dz_dx, dz_dy


def slope_a(a, cell_size=1, degrees=True, verbose=False, keepdims=False):
    """Return slope in degrees for an input array using the 3rd order
    :  finite difference method for a 3x3 moving window view into the array.
    :
    :Requires:
    :--------
    : a - an input 2d array.  X and Y represent coordinates of the Z values
    : cell_size - must be in the same units as X and Y
    : degrees - True, returns degrees otherwise radians
    : verbose - True, to print results
    : keepdims - False, to remove/squeeze extra dimensions
    : filter - np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]]) **current default
    :
    :Notes:    dzdx: sum(col2 - col0)/8*cellsize
    :-----     dzdy: sum(row2 - row0)/8*cellsize
    : Assert the array is ndim=4 even if (1,z,y,x)
    :   general         dzdx      +    dzdy     =    dxyz
    :   [[a, b, c],  [[1, 0, 1],   [[1, 2, 1]      [[1, 2, 1]
    :    [d, e, f]    [2, 0, 2], +  [0, 0, 0]  =    [2, 0, 2],
    :    [g, h, i]    [1, 0, 1]]    [1, 2, 1]]      [1, 2, 1]]
    """
    frmt = """\n
    :----------------------------------------:
    :{}
    :input array...\n    {}
    :slope values...\n    {!r:}
    :----------------------------------------:
    """
    # ---- stride the data and calculate slope for 3x3 sliding windows ----
    a_s = stride(a, r_c=(3, 3))
    if a_s.ndim < 4:
        new_shape = (1,) * (4 - len(a_s.shape)) + a_s.shape
        a_s = a_s.reshape(new_shape)
    # ---- apply the default filter to the array ----
    f_dxyz = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]], dtype="float64")
    dz_dx, dz_dy = filter_a(a_s, filter=f_dxyz, cell_size=cell_size)
    #
    s = np.sqrt(dz_dx**2 + dz_dy**2)
    if degrees:
        s = np.rad2deg(np.arctan(s))
    if not keepdims:
        s = np.squeeze(s)
    if verbose:
        p = "    "
        args = ["Results for slope_a... ",
                indent(str(a), p), indent(str(s), p)]
        print(dedent(frmt).format(*args))
    return s


def aspect_a(a, cell_size=1, flat=0.1):
    """Return the aspect of a slope in degrees from North.
    :Requires:
    :--------
    : a - an input 2d array.  X and Y represent coordinates of the Z values
    : flat - degree value, e.g. a flat surface <= 0.05 deg
    :        0.05 deg => 8.7e-04    0.10 deg => 1.7e-02
    """
    if not isinstance(flat, (int, float)):
        flat = 0.1
    a_s = stride(a, r_c=(3, 3))
    if a_s.ndim < 4:
        new_shape = (1,) * (4 - len(a_s.shape)) + a_s.shape
        a_s = a_s.reshape(new_shape)
    # ---- filter the array, using the default filter f_dxyz ----
    f_dxyz = np.array([[1, 2, 1], [2, 0, 2], [1, 2, 1]], dtype="float64")
    dz_dx, dz_dy = filter_a(a_s, filter=f_dxyz, cell_size=1)
    #
    asp = np.arctan2(dz_dy, -dz_dx)     # relative to East
    s = np.sqrt((dz_dx*cell_size)**2 + (dz_dy*cell_size)**2)  # the slope
    asp = np.rad2deg(asp)
    asp = np.mod((450.0 - asp), 360.)   # simplest way to get azimuth
    asp = np.where(s <= flat, -1, asp)  # classify small slopes as 'flat'
    return asp


def _demo():
    """Demonstration of the calculations"""
    frmt = """
    :------------------------------------------------------------------
    :{}
    :input array
    {}
    :slope values
    {}
    :zero?
    {}
    :aspect values
    {}
    :
    :------------------------------------------------------------------
    """
    p = "    "
    t = 1.0e-12
    d = [[0, 0, 0, 0, t, t, t, 0, 0, 0, 0, 0, 0, 4, 4, 4, 2, 2, 2],
         [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, t, t, t, 4, 4, 4, 2, 2, 2],
         [0, 0, 0, 0, 0, 0, 0, t, t, t, 0, 0, 0, 4, 4, 4, 2, 2, 2]]
    a = np.array(d, dtype='float64')
    sl = slope_a(a, cell_size=5, degrees=True, verbose=False, keepdims=False)
    sboo = np.where(sl == 0.0, " T", " F")
    asp = aspect_a(a)
    z = np.ma.asarray(a)
    z.mask = (z == t)
    args = ["Surface properties",
            z,
            indent(str(sl), p),
            indent(str(sboo), p),
            indent(str(asp), p)]
    for i in d:
        print(i)
    print(dedent(frmt).format(*args))
    return a, sl, asp, d

# ---------------------------------------------------------------------
if __name__ == "__main__":
    """Main section..."""
    a, sl, asp, d = _demo()
```

That's all for now.

Once slope and aspect are obtained, the hillshade can be derived.  The code is fairly self-explanatory.

```python
def hillshade_a(a, cell_size=1, sun_azim=315, sun_elev=45):
    """Hillshade calculation as outlined in Burrough and implemented by
    : esri in ArcMap and ArcGIS Pro.  All measures in radians.
    : z, az => the sun's zenith angle and azimuth
    : sl, asp => surface properties
    : 255.0 * ((cos(z) * cos(sl)) + (sin(z) * sin(sl) * cos(az-asp)))
    """
    s_azi = np.deg2rad(sun_azim)
    s_elev = np.deg2rad(90.0 - sun_elev)
    # aspect_a and slope_a return degrees, so convert back to radians
    a_a = np.deg2rad(aspect_a(a, cell_size=cell_size))
    a_s = np.deg2rad(slope_a(a, cell_size=cell_size))
    out = 255 * ((np.cos(s_elev) * np.cos(a_s)) +
                 (np.sin(s_elev) * np.sin(a_s) * np.cos(s_azi - a_a)))
    out = np.where(out < 0, 0, out)
    return out.astype('int')
```
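To see the numbers the formula produces, here is a single-cell version written with plain math... a sanity check on the equation above, not part of the script itself:

```python
import math

def hillshade_cell(slope_deg, aspect_deg, sun_azim=315, sun_elev=45):
    """Hillshade for one cell: 255*(cos(z)cos(sl) + sin(z)sin(sl)cos(az-asp))."""
    z = math.radians(90.0 - sun_elev)    # sun's zenith angle
    az = math.radians(sun_azim)
    sl = math.radians(slope_deg)
    asp = math.radians(aspect_deg)
    return 255.0 * (math.cos(z) * math.cos(sl) +
                    math.sin(z) * math.sin(sl) * math.cos(az - asp))

print(round(hillshade_cell(0, 0)))      # flat surface -> 180
print(round(hillshade_cell(45, 315)))   # 45 deg slope facing the sun -> 255
```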

Other terrain derivatives will be added soon.

# Free Frequency ... join the cause

Posted by Dan_Patterson Mar 3, 2016

Free Frequency...

Announced  2017-04-15  ***Frequency is freed... version for ArcGIS Pro

Created        2016-03-02

Modified       2016-12-20  *** the shackles remain in 10.5... make it your resolution in 2017 to join the cause

References   many

Come on. That is lame. This type of question comes up time and again.

Select your options from the following request.

------------------------------

I have ...

• two   - several   - many

columns containing class information with

• the same   - different data types

and I want to combine them to produce a unique classification scheme and

• count the number in each class
• calculate some statistical value,
• graph it.

------------------------------

Get the picture?  Sound familiar?

I will show the results of doing this type of analysis and provide an attachment for you to do it.

In ArcGIS, the Frequency tool produces these results

Now these results look similar... in fact identical... but they were created without using the Frequency tool, using only the tools provided within ArcMap itself.

The input array, the sorted array with its class and first-row markers, and the summary appear in turn below.

Input array

A     B

'a',   'c'

'b',   'a'

'c',   'b'

'c',   'b'

'b',   'a'

'a',   'c'

'b',   'a'

'c',   'b'

'b',   'a'

'a',   'c'

'a',   'a'

'c',   'b'

Sorted array

A     B    class  row

'a',   'a'     0       0

'a',   'c'     1       1

'a',   'c'     1

'a',   'c'     1

'b',   'a'     2       4

'b',   'a'     2

'b',   'a'     2

'b',   'a'     2

'c',   'b'     3       8

'c',   'b'     3

'c',   'b'     3

'c',   'b'     3

Summary

combinations    class

( 'a',   'a' )              0

( 'a',   'c' )              1

( 'b',   'a' )              2

( 'c',   'b' )              3

first occurrence  0, 1, 4, 8

reclassed list

0 1 1 1 2 2 2 2 3 3 3 3

class     0   1   2   3   4

count    1    3   4   4   -

Now to produce the results above, you could

• build an SQL style query using the unique combinations of inputs for each field   (ie if A =='a' and B == 'a': then...)
• concatenate the values in each field (accounting for differences in data types), get the unique combinations

These can be determined using:

• built-in tools in ArcMap (the field calculator, and the Calculate Field, Frequency and Summarize tools).
• scripting, using python with searchcursors or arcobjects.  You could even make a fancy interface to collect and present the results.
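For the numpy-inclined, the second option above (concatenate the field values, then find the unique combinations) can be sketched in a few lines.  This is an illustration only... the attachment's implementation may differ:

```python
import numpy as np

# The two class columns from the example above
A = np.array(['a', 'b', 'c', 'c', 'b', 'a', 'b', 'c', 'b', 'a', 'a', 'c'])
B = np.array(['c', 'a', 'b', 'b', 'a', 'c', 'a', 'b', 'a', 'c', 'a', 'b'])
# Concatenate the fields, then pull out the unique combinations,
# the class assigned to each row, and the count in each class.
AB = np.char.add(A, B)
combos, classes, counts = np.unique(AB, return_inverse=True,
                                    return_counts=True)
print(combos.tolist())   # ['aa', 'ac', 'ba', 'cb']
print(counts.tolist())   # [1, 3, 4, 4]
print(classes.tolist())  # the class number assigned to each input row
```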

You can even produce graphs.

So... how was it done?  ... it is in the attachment.

Join the cause... frequently used tools should be part of the standard arsenal.

# Basic fancy formats ....

Posted by Dan_Patterson Mar 1, 2016

Basic fancy formats...

Created:   2016-02-28

Modified: 2016-08-13

Attachments:

Script attached in zip file below.

References:

This is a work in progress.

I am tired of reinventing the wheel every time I want to format output so it looks appealing and conveys the information I want.  To this end, I am going to present what I have... but I will do this in increments.

I will not provide commentary... I don't want to cloud the issue with rules and specifics.  I use a monospaced font to ensure that the output is what you would expect in any python IDE, and to control vertical alignment within a pure text environment.  A sample script is attached so you can experiment with your own ideas.

Consult the script header for further information.  The script also contains some format tips that arise from formatting script... which I have posted on previously.

Sample output (courier new 10pt)

This is the only font that will maintain monospacing from the available font family.  I formatted using the "Plain" syntax highlighter so that line numbers were created for reference purposes.

```
Basic formatting....
    :  |_{ [name/num] [!s, !r, none] [: format_spec] }_|
    :  |_{ vertical bar to denote start, _ optional spaces
    :  }_| vertical bar to denote end, _ optional spaces
    : ------------------------------------
    :   [name/num]  = empty, name, number
    :   [!s or !r]  = str or repr format or nothing
    :   [format]    = see examples in output

 1   |{0:_>15.0f}|   - lead _, right, integer           |_________123456|
 2   |{0:_>15.3f}|   - lead _, right, float             |_____123456.012|
 3   |{0:_>+15.3f}|  - lead _, right, req. sign, float  |____+123456.012|
 4   |{0:_> 15.3f}|  - lead _, right, opt. sign, float  |____ 123456.012|
 5   |{0:> 15.3f}|   - no lead, right, opt. sign, float |     123456.012|
 6   |{0:>15.3f}|    - whatever, float                  |     123456.012|
 7   |{0: <15.3f}|   - left, float, trailing space(s)   |123456.012     |
 8   |{0: < 15.3f}|  - left, space(s), float, spaces    | 123456.012    |
 9   |{0:_< 15.3f}|  - left, space(s), float, __        | 123456.012____|
10   |{0!s:<15}|     - left, string                     |123456.01234   |
11   |{0!s:*<15}|    - left, string space(s)            |123456.01234***|
12   |{0!s:*^15}|    - center, lead/trail               |*123456.01234**|
13   |_{0!s: ^13}_|  - lead, center, trail              |_123456.01234 _|
14   |_{0!s:*<}_|    - no size, lead ignored, left      |_123456.01234_|
15   |_{0: >5f}_|    - float, undersized                |_123456.012340_|

Fancy formatting....
Format elements are generated from field widths and content.
Some examples for ... 1234.5 ...
widths.... [6, 8, 10, 12]
building..
    f = [ "|_{{0!s:^{}}}_|".format(width) for width in widths ]
    txt = "".join(i for i in f)
    print(txt.format(1234.5))
formats... ['|_{0!s:^6}_|', '|_{0!s:^8}_|', '|_{0!s:^10}_|', '|_{0!s:^12}_|']
result.... |_1234.5_||_ 1234.5 _||_  1234.5  _||_   1234.5   _|
```

More to come....

# Fancy, fancy formatting

Posted by Dan_Patterson Mar 1, 2016

Created:   2016-03-01

Modified:  2016-08-13

Attachments:

Script attached in zip file below.

References:

This is a work in progress

This example shows how to generate column formats from the data itself and how to generate a header from the tabular information.  The simplest way to obtain a useful table for manipulation in other software is to use the FeatureclassToNumPyArray or TableToNumPyArray methods in arcpy.  A sample output for such a table is shown in the example.

In the example below, the following procedure was used:

• the first row of the table is used to obtain the column widths and data type
• an appropriate format string was used to create the column structure... NOTE. as in the previous example a vertical bar, |, is used to denote the column positional start and the underscore, _, to denote preceding or trailing spaces.
• the format string and the number of columns in the table creates the line format,
• combining that information with column width gives a resultant column format
• an option to use the maximum size between the column name and data size is used to modify the values of column width.
• finally... a column header...generated from the input data... is produced and the line is printed.

The next example will delve into how to do this all in one swoop giving the option to use column widths based upon data size with the columns and/or whether to maintain column size, alignment, record delimiters and whether to include ancillary information related to the table.

Format demo

```
Input array....
:.... [(0, [300000.0, 5025000.0], 12345, 1.2345, b'string', 'unicode')]

format string: |_{{!s:<{}}}_
column widths: [2, 21, 5, 6, 9, 7]
column type:   ['int32', 'ndarray', 'int32', 'float64', 'bytes_', 'str_']
data dtype:    ['i', 'f', 'i', 'f', 'S', 'U']

line format....
  |_{{!s:<{}}}_|_{{!s:<{}}}_|_{{!s:<{}}}_|_{{!s:<{}}}_|_{{!s:<{}}}_|_{{!s:<{}}}_
resultant:..
  |_{!s:<2}_|_{!s:<21}_|_{!s:<5}_|_{!s:<6}_|_{!s:<9}_|_{!s:<7}_

Column header...
 |.ID.|..........XY...........|..C_0..|..C_1...|....C_2....|...C_3...
Output line.....
 |_0 _|_[  300000.  5025000.]_|_12345_|_1.2345_|_b'string'_|_unicode_
```

More later...
