
In my previous post about the Combined Field Statistics addin for ArcMap I mentioned that it was inspired by another addin toolkit I am currently developing which analyzes and displays rates of disease occurrence in a map.  In this post I want to share with you another addin project which was borne out of the disease mapping project.  This one is called Diverging Color Ramp.  This addin enables you to intelligently apply a dichromatic color ramp to feature layers containing points, lines, or polygons in ArcMap 10.4 and above.  It is somewhat similar in capability to the smart mapping tools in ArcGIS Online and is useful if you need to display data with values that diverge from a specific threshold value which you define.  The screenshot at the top of this post shows a map layer of population density where the colors diverge from an arbitrarily defined threshold value of 40,000/square mile.  Other examples of threshold values might include 1.0 for ratios, 0.0 for values which represent gain or loss from an average value, or any value which represents a significant threshold of interest in your data.

This tool gives you a great deal of control over how a feature layer is rendered.  It provides a pair of default color ramps, one for values above the threshold and another for values below the threshold, as well as a separate threshold color symbol.  If you don't like the colors you can change them to anything you like by accessing them from the Style Manager in ArcMap.  You can also choose the classification method and number of classes, and even choose the number of classes which are generated for values above and below the threshold.  This addin also has a "secret" trick up its sleeve.  When it generates the symbols for each class in the map layer's renderer, it actually makes a clone of the symbol for the first class in the current renderer to use as a template for the rest of the symbols.  After that it simply sets the background color of each symbol to colors pulled from the color ramps.  This makes it easy to quickly apply global changes to other properties in the symbols, such as outline color or fill pattern, since you only have to make the change in the first symbol and then apply it as a template to each of the symbol classes with a single button click.  This is a fun tool to use since it makes it easy to try out different ways to visualize your data.  I hope you enjoy using it.

Download links:

Diverging Color Ramp (addin + documentation)

Diverging Color Ramp (source code + documentation)
I'm currently working on a new add-in toolkit for ArcMap 10.4 to analyze and display rates of disease occurrence in a map.  This is a work-in-progress and I will provide more details about it in the coming months.  But in the meantime I want to share with you the result of a smaller project that was borne out of the disease mapping project.

One of the requirements for the disease mapping project is the ability to calculate the sums of values from age group columns in a feature layer.  Generally, the tool of choice for a task like this is the Summary Statistics geoprocessing tool in ArcToolbox.  However, I also needed the ability to calculate sums from the combined values in multiple age group columns.  For example, to calculate the total number of individuals in the columns for ages 0-4 and 5-9, to obtain an aggregate total for ages 0-9.
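Conceptually, the combined-column aggregation is just a sum over several fields per row.  A minimal Python sketch of the idea (illustrative only; the field names are hypothetical, and the actual add-in computes the full set of summary statistics, not just sums):

```python
def combined_sum(rows, fields):
    """Sum the values of several columns across all rows to produce one
    aggregate total, e.g. AGE_0_4 + AGE_5_9 -> total for ages 0-9."""
    return sum(row[field] for row in rows for field in fields)

# Hypothetical attribute rows from a feature layer.
rows = [
    {"AGE_0_4": 3, "AGE_5_9": 2},
    {"AGE_0_4": 1, "AGE_5_9": 4},
]
total_0_9 = combined_sum(rows, ["AGE_0_4", "AGE_5_9"])
```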

It occurred to me that this capability could have broader applications if the tool could also calculate the full set of statistics provided by the Summary Statistics tool.  So I decided to build a more generic stand-alone addin tool in parallel with the disease mapping toolkit.  The result of this effort is the Combined Field Statistics Addin.  This tool is very simple to use and includes a detailed user guide to get you started.  It also generates an output log so you can keep track of how the tool was configured to generate a particular output.  If this capability sounds useful to you give it a try and let me know what you think in the comments!

Download links:

Combined Field Statistics (addin + documentation)

Combined Field Statistics (source code + documentation)


The source code for Dimension Explorer, an add-in for exploring time-aware and multidimensional data, is now available for download.  Version 1.11 is a minor update from version 1.1 and is built specifically for ArcMap 10.4.  This version features tighter integration with the ArcMap Table of Contents window - as layers are added, removed or changed, the  contents of the layer selection list in the Settings window are updated accordingly.

Update - I've had reports that the layer selection list is not getting updated consistently as layers are added, removed, or had their properties updated in the map document.  Until this issue is resolved, I recommend users of ArcMap 10.3.1 and 10.4 use version 1.1 of Dimension Explorer instead.  I apologize for any inconvenience this may have caused.

Download links:

Dimension Explorer 1.11 (addin and documentation)

Dimension Explorer 1.11 (source code and documentation)

Dimension Explorer 1.1 (original blog post)


The parallel fences generated in the X and Y directions by two separate runs of the Parallel Fences tool.

Note:  The 3D Fences Toolbox was updated Feb 28, 2017.  

This article introduces and discusses a geoprocessing toolbox that can perform geostatistics on vertical slices of three dimensional point clouds.  Click here to download the toolbox.

For years, users of ArcGIS and its Geostatistical Analyst extension have been able to perform sophisticated geostatistical interpolation of data samples in two dimensions, making it feasible to create continuous maps of any phenomenon that was measured only at selected locations.  With the growing popularity of analysis in 3D and the availability of 3D data, demand has grown for the ability to perform geostatistics on 3D data.  Studies of atmospheric cross-sections, geologic profiles, and bathymetric transects have become an integral part of GIS.  In each of these three classical elements (air, earth and water), we can measure natural phenomena, such as the gradient of air temperature, the content of an ore at various depths of the geologic strata, or the salinity of the ocean along adjacent vertical transect lines.

In GIS we analyze not only natural phenomena, but also those that are man-made.  Human impacts on air, earth, and water are also measured.  Among the human contributions to the environment that can be analyzed in 3D are plumes of air pollution and oil spills.  The latter may migrate down into the ground and be carried by groundwater drift, or migrate from the bottom of an ocean up to its surface, sometimes dragged by ocean currents.  Having insight into such occurrences can greatly increase our understanding of the phenomena.  This was the motivation for creating the 3DFences Toolbox.

A slice is a vertical subset of the 3D data, analogous to a slice of bread from a loaf.  While typically narrow in one dimension, it is still a 3D object.  A fence is a 2D representation of a slice of 3D data: all points which belong to a slice are projected, or pressed, onto a 2D plane.  The term fence diagram, or fence, is used in geology to illustrate a cross section of geologic strata generated from an interpolation of data coming from a linear array of vertical drillings.  The equivalent of a fence in the atmospheric sciences is usually referred to as a curtain.

Top view of a slice of points shown in purple, and the resulting fence presented as a colored line in the center of the input points.


Side view of the input slice points located on one side of the resulting fence. The points are the measurements of oil in sea water after an oil spill.

To be explicit, the tools in the 3D Fences toolbox do not perform 3D interpolation.  The approach offered by the 3DFences Toolbox does not implement geostatistical analysis directly on the vertical slices of the 3D data.  Instead, the tools transform a slice of the 3D data, with its X, Y, Z, and measured-value components, by rotating it 90° onto a horizontal 2D plane.  The geostatistical interpolation method of Empirical Bayesian Kriging (EBK) is performed on these points, producing either a geostatistical surface of Prediction, which is a continuous map of the concentration or intensity of the measured phenomenon, or a map of the Prediction Standard Error, which can be understood as a map of the degree of confidence in the Prediction map at each location.  The resulting output is converted to a point dataset where the points represent the interpolated value at the center of the raster cells.  The points are then placed back into the original coordinate space as a regular matrix of points resembling a fence.  When displayed in ArcScene or ArcGIS Pro, the fence is positioned in the center of the selected points of the initial slice.  The raster is converted to a point dataset because ArcGIS does not currently support display of raster data as a vertical plane, and point symbology options provide added flexibility in displaying results.
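The flattening step can be sketched as follows.  This is an illustrative reconstruction, not the toolbox's actual code; it assumes the fence line is defined by two endpoints in map coordinates, and reduces each 3D sample to a (distance along the line, depth) pair so that a standard horizontal-plane interpolation can be run on the result:

```python
import math

def flatten_slice(points, line_start, line_end):
    """Project 3D sample points (x, y, z, value) onto the vertical plane
    of a fence line, returning (distance-along-line, depth, value) tuples,
    i.e. the slice "pressed" onto a 2D plane ready for rotation."""
    x0, y0 = line_start
    x1, y1 = line_end
    dx, dy = x1 - x0, y1 - y0
    length = math.hypot(dx, dy)
    ux, uy = dx / length, dy / length  # unit vector along the fence line
    flattened = []
    for x, y, z, value in points:
        d = (x - x0) * ux + (y - y0) * uy  # distance along the line
        flattened.append((d, z, value))
    return flattened
```

For example, a sample at (3, 4, -5) relative to a fence line running east from the origin lands 3 units along the fence at depth -5, regardless of its offset to the side of the line.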

Any of the ArcGIS standard or geostatistical point interpolation tools could have been implemented in the tools. For the prototype, we have chosen the Empirical Bayesian Kriging (EBK) geostatistical method known for its best fitting default parameters and accurate predictions.

The 3DFences toolbox consists of three separate tools to support different methods of generating fences. The Parallel Fences tool can generate sets of parallel fences in the directions that are related to either longitudes, latitudes or depths.  In other words, the output sets of parallel fences stretch from N to S, W to E, or through the Z dimension.  The number of these fences in each set is determined by the user. All of the tools support selection sets to create fences from a subset of sample point features.

A fence created along digitized "S" shape.

The Interactive Fences tool can generate fences based on lines digitized on the map.  The user sets the buffer distance from the digitized line, and all points located within the buffer will be used for the geostatistical analysis.  The user may digitize multiple lines, and even self-intersecting lines with many vertices.  The third tool, called Feature based Fences, creates fences based on existing features in a polyline feature class.  In this case the fence shape is determined by the existing feature(s) and extends through the Z dimension of the selected sample points.  As an example, it might be applied to investigate oil leaks above an oil pipeline placed on the sea floor.

All of the tools contain options enabling the user to determine the minimum number of sample points and fence size required to generate a reasonable geostatistical surface. The tools are also time aware. If the sample data contains a date-time field and the option is enabled, a fence will be generated for each time interval if the samples for that interval and location meet the minimum requirements set by the user. The resulting fences representing consecutive time windows are positioned at the same locations. Thus, to enable better visual analysis, these should be displayed as time animations.
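The per-interval check described above can be sketched in Python.  This is a simplified illustration of the idea, not the toolbox's implementation; the function name and parameters are hypothetical:

```python
from collections import defaultdict
from datetime import datetime, timedelta

def group_by_interval(samples, start, interval, min_count):
    """Bucket (timestamp, sample) pairs into fixed time windows, keeping
    only the windows with enough samples to interpolate a surface."""
    buckets = defaultdict(list)
    for timestamp, sample in samples:
        index = int((timestamp - start) / interval)  # which time window
        buckets[index].append(sample)
    # A fence is generated only for windows meeting the minimum count.
    return {i: s for i, s in buckets.items() if len(s) >= min_count}
```

Fences built from consecutive windows occupy the same location in space, which is why the results are best viewed as a time animation.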

An example of an atmospheric curtain created from backscatter data acquired by NASA from its Calipso satellite orbiting at 32 km above the Earth's surface.

Contributed by Bob G. and Witold F.

A new version of Dimension Explorer is now available for download.  Dimension Explorer 1.1 is an addin tool for exploring time-aware and multidimensional map data in ArcMap 10.3 and above.

Here is what's new in version 1.1:

    • map layers created with the Make NetCDF Raster Layer and Make NetCDF Feature Layer geoprocessing tools are now supported.


    • map layers with vertical dimension values defined as ranges (e.g. 0-10 meters, 5-50 meters, etc) are now supported.


    • map animations in ArcMap can now be exported as a series of still images, for creating video animations with applications such as Windows Movie Maker and ffmpeg.


    • various bug fixes and optimizations.

Here is a video animation of the minimum Arctic sea ice extent for the years 1979 - 2014.  I created it with Windows Movie Maker using still images exported via Dimension Explorer 1.1.  The map includes a time-aware layer created with the Make NetCDF Feature Layer geoprocessing tool with data from NOAA.

Dimension Explorer 1.1 can be downloaded here


If you are looking for data to get started with Dimension Explorer,  the NOAA Earth System Research Laboratory, Physical Sciences Division, (NOAA/ESRL PSD) has many large collections of spatial scientific data for climate and oceans in NetCDF format.  I recommend starting with their gridded climate datasets.   You can add most of their datasets to ArcMap using the Make NetCDF Raster Layer geoprocessing tool.  If you get the error "One or both dimensions have variable spacing in their coordinates", use the Make NetCDF Feature Layer geoprocessing tool instead.  If the datasets are stored as multiple files representing the data at different times, use the Mosaic Dataset to temporally aggregate the files using the NetCDF Raster Type.  Finally, if you are working with temporal data of any type, be sure to time-enable the layer in ArcMap.

Ürümqi in China is the remotest location on Earth, geographically speaking of course. This discovery and the analysis behind it are discussed in a recently published story map entitled Poles of Inaccessibility.

Vilhjalmur Stefansson, an Icelandic explorer, introduced the world to the concept of inaccessibility with his 1920s computation of the Arctic's pole of inaccessibility. Story map authors Dr Witold Frączek and Mr Lenny Kneller recomputed this Arctic location along with inaccessible locations on six continents. Frączek and Kneller computed and compared remote locations using geodesic and planar computations. Differences were small with the exception of the Eurasian continent, which has a variation of approximately 11 kilometers.

The authors discovered that South America is essentially bi-polar with respect to inaccessibility. While the two locations are both in Brazil and separated by 1,400 km, the difference in their inaccessibility was less than a kilometer. It is conceivable that the order of remoteness may change with the interpolation and generalization of the coastline.

Lastly, I would encourage you to take a moment to read and view Frączek and Kneller's Poles of Inaccessibility.

Update - a more recent version of Dimension Explorer is now available.  Click here for more information.

Dimension Explorer, an addin tool for ArcMap, has just been released by the Esri Applications Prototype Lab!

Dimension Explorer 1.0 makes it easier to work with  time-aware and multidimensional data in ArcMap 10.3 by providing slider controls for navigation.  It works by retrieving dimensional information from a map layer to build an interactive dimensional model that can be edited and saved in the map document.  Dimension Explorer is the successor to the Timeliner addin for ArcMap, which also works in ArcMap 10.3 and can be downloaded here.

Click here to download Dimension Explorer

With the 10.3 release of ArcGIS, the mosaic dataset now supports multidimensional data in NetCDF, GRIB, and HDF format.  Dimension Explorer supports map layers based on mosaic datasets and image services which are time-aware or multidimensional, and time-aware feature classes.

Download Dimension Explorer and let us know what you think of it in the comments!


Hydro Hierarchy is an experimental web application for visualizing the US river network.

Click here to view the live application.

Source code is available on agol and github.

There are approximately a quarter of a million rivers in the United States, but only 2,500 are displayed in this application.  This subset represents streams with a Strahler stream order classification of four or greater.  The stream data used by this application is derived from the USGS's National Hydrography Dataset and has undergone significant spatial editing.  Stream geometries have been adjusted to ensure connectivity, generalized for small scale mapping and re-oriented in the direction of flow.
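Strahler order, used here to thin the network, is defined recursively: headwater reaches have order 1, a reach downstream of a single tributary keeps that tributary's order, and the order increases by one only where two or more tributaries of equal order join.  A minimal sketch of the computation (illustrative, not the application's code), with the network given as a mapping from each reach to its direct tributaries:

```python
def strahler(tree, node):
    """Compute the Strahler order of a reach in a tributary tree."""
    children = tree.get(node, [])
    if not children:
        return 1  # headwater reach
    orders = [strahler(tree, child) for child in children]
    top = max(orders)
    # Order increases only where two tributaries of equal top order meet.
    return top + 1 if orders.count(top) >= 2 else top
```

Filtering to order ≥ 4 keeps only reaches fed by a reasonably deep tree of tributaries, which is the kind of filter that reduces a quarter-million reaches to about 2,500.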

River flow data was acquired from the USGS's WaterWatch website.  Each river segment is attributed with the average flow for each month in 2014 and the ten year monthly average.  Computed values, in cubic feet per second, represent the flow at the downstream end of each river.  Flow data is displayed as a column chart on the left hand side of the browser window whenever the user's mouse passes over a stream.

The preview animated image at the beginning of this post may look sped up.  It is not.  Upstream and downstream rivers are highlighted in real time as the user moves his or her cursor over the hydrologic network.  This performance is achieved using connectivity information loaded from this file when the application first starts.  The file was created in ArcMap from a network dataset that included the river feature class.  Using this script, connectivity information for each network node was extracted and arranged into a hierarchical data structure.
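The real-time highlighting amounts to a graph traversal over that preloaded connectivity structure.  The application does this in JavaScript; here is a Python sketch of the upstream case (the function and data names are illustrative, not the script's actual structure):

```python
def upstream(reaches, start):
    """Collect every reach upstream of start, given a mapping from each
    reach to the reaches that flow directly into it."""
    found, stack = set(), [start]
    while stack:
        node = stack.pop()
        for parent in reaches.get(node, []):
            if parent not in found:
                found.add(parent)
                stack.append(parent)  # keep walking upstream
    return found
```

Because the hierarchy is built once at startup, each mouse-over only pays the cost of this traversal, not a spatial query.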

The radial and column charts on the left hand side of the application are generated using the D3 graphics library.  The column chart displays 2014 flow data for any river segment that is directly below the user's mouse.  The horizontal red line represents the ten year mean monthly flow.  Note that for most river segments, only one or two months ever exceeded the ten year average.  This is indicative of 2014's drought, at least with respect to river flows over the past decade.

Contributed by Richie C. and Witold F.


Solar Eclipse Finder is a JavaScript-based web application that displays past and future solar eclipses that pass through a user defined location.  The source data is published as an AGOL hosted service with the paths of 905 solar eclipses from 1601 to 2200.  The eclipse paths were prepared by Michael Zeiler from data courtesy of Xavier Jubier.

The live application is available here.

Source code is available on agol and github.

The application was originally developed 2½ years ago as a Silverlight-based web application (see blog posting here); we wanted to confirm that the same performance and advanced symbology are achievable today with HTML5/JavaScript in modern browsers.

jQuery & Bootstrap

jQuery is a JavaScript framework for DOM manipulation.  It is important to note that jQuery is not a prerequisite for mapping apps using Esri's ArcGIS API for JavaScript.  It is, however, a prerequisite of many third party JavaScript libraries like Bootstrap, a popular user interface framework.  This application uses Bootstrap's popover tooltips in the fly-out attribute window and its modal dialog during start-up.


The tapered symbol used by eclipse shadow paths is achieved using a linear gradient fill.  Linear gradient fills are not supported by the ArcGIS API for JavaScript.  However, they are supported by SVG, the underlying technology used by Esri's JavaScript API for rendering vectors.  We used Mike Bostock's D3.js JavaScript library to insert and apply linear gradient fills directly to the map's embedded SVG node.


Updating this application was a two step process.  First the eclipse dataset was republished as an AGOL hosted feature service and, second, the app was rewritten in HTML/JS.  Both tasks were relatively effortless and only took a couple of days in total.



Posted by rcarmichael-esristaff Employee Nov 18, 2014


GeoJigsaw is a community driven geographic jigsaw puzzle.  If you are feeling creative you can create a puzzle and share it with the puzzle community.  If you are feeling competitive, try beating the high score on someone else's puzzle.

Click here to view the live application (works best in Microsoft Internet Explorer and Mozilla Firefox).

Click here to download the source code from ArcGIS Online.  A simpler version of this application is available on github here.

GeoJigsaw is a JavaScript-based web application inspired by a Silverlight application developed about two years ago called Puzzle Map.  Unlike Map Quiz, a previously published geo-game that uses Facebook, in this application we wanted to explore anonymous collaboration.  That is, anyone can anonymously create, share and play puzzles.


In the app developed two years ago, the puzzle design was static.  In this application we wanted to offer puzzles of varying difficulty and size, so we needed to implement a technique for dynamic puzzle creation.  After a little research we discovered this example of a Voronoi tessellation using D3.  D3's Voronoi implementation and associated SVG-based visualization library are the basis of this game.
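The seed points that drive such a tessellation can come from a lightly perturbed grid rather than pure randomness, which keeps the pieces roughly uniform in size.  A Python sketch of that idea (illustrative only; the application itself does this in JavaScript with D3, and the function name, defaults and bounds here are assumptions):

```python
import random

def puzzle_seeds(cols=10, rows=10, width=400.0, height=400.0,
                 jitter=0.25, seed=None):
    """Generate a cols x rows grid of Voronoi seed points, each nudged by
    a small random offset so the resulting cells look hand-cut rather
    than perfectly square. jitter is the maximum offset as a fraction of
    one cell's size."""
    rng = random.Random(seed)
    cell_w, cell_h = width / cols, height / rows
    points = []
    for i in range(cols):
        for j in range(rows):
            x = (i + 0.5) * cell_w + rng.uniform(-jitter, jitter) * cell_w
            y = (j + 0.5) * cell_h + rng.uniform(-jitter, jitter) * cell_h
            points.append((x, y))
    return points
```

Feeding the 100 nudged points to a Voronoi tessellation yields cells that are grid-like in layout but organic in shape.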

Unlike the D3 sample, our app did not use completely randomized points.  When the user creates or selects an "impossible" puzzle, a 10 by 10 grid of points is created and nudged slightly before being turned into a 100 piece Voronoi diagram using D3.  This was only part of the puzzle (excuse the pun); each piece needed the addition of one or more tabs.  Tab addition is essential to give the game its recognizable jigsaw look.  Through a process of iteration, tabs are appended to any side of sufficient length and reversed if an opposing tab exists.

SVG Filters

The finishing touch that gives the puzzle a realistic look is the application of an inner bevel using an SVG filter.  SVG filters are hardware accelerated in Internet Explorer and Mozilla Firefox but not in Google Chrome.  Unfortunately, Chrome's software rendering of SVG filters makes complex puzzles almost unplayable.  This may change in future releases of Chrome.


Puzzle designs, ratings and scores are stored in ArcGIS Online (AGOL) hosted services.  We intended the application and associated services to be accessible to non-AGOL users.  This meant that AGOL user credentials could not be used to restrict access to prevent unanticipated malicious activity.  As such, we used the security model discussed in the previous post, that is, app registration and an intermediate web proxy.

Libraries Used



  • Bootstrap by Twitter Inc: A useful and comprehensive UI framework.  This application leveraged a subset of Bootstrap pertaining to buttons and input elements.




  • jQuery by jQuery Foundation Inc: A JavaScript framework for DOM manipulation and a foundation for many other frameworks.



This project demonstrates that modern browsers are more than capable of impressive visualizations without the need for plugins such as Silverlight or Flex.  We also wanted to experiment with anonymous game play; time will tell if the lack of user identification is an incentive or disincentive to play.  Good luck!


Map Quiz

Posted by rcarmichael-esristaff Employee Nov 17, 2014


Map Quiz is a fun JavaScript-based geo-game developed by the Applications Prototype Lab. The game tests your geographic knowledge with six randomly selected satellite images. Prove that you are a geo-genius at home, school or in your workplace!

Click here to access the live application.

Click here to download the source code.


This project is a port of an application initially developed as a Windows Store application.  The purpose of the application is to present a fun geography-focused game (or "geo-game") based on Esri technology.  The application is primarily based on Esri's ArcGIS API for JavaScript and is powered by ArcGIS Online (AGOL) hosted services for scoring and questions.  The original Windows Store application used AGOL-based authentication; in the JavaScript edition we decided to use Facebook authentication in an effort to appeal to a larger audience.

Why Facebook?

One of the motivations of gaming is the thrill of competition, whether with oneself or with others.  In order for users to monitor their scores, or those of others, there needed to be some sort of authentication.  The obvious choice is AGOL.  AGOL is the perfect choice when collaboration is needed within the GIS community, but a large percentage of the target audience of this application may not have AGOL credentials. Facebook may not be completely ubiquitous, but it is certainly common.  As such, we decided to use Facebook and the Facebook API to authenticate users.  To ease privacy concerns this application requests and displays only a small subset of profile information, specifically, a person's profile picture, last name and initial of first name.

Controlling Access to Hosted Services

The game's questions, answers and scores are stored in two ArcGIS Online hosted feature services.  Hosted services are easy to create and allow for powerful spatial queries.  However, access to a hosted service is either unrestricted, or confined to an organization or to users that belong to specific groups.  Because this game is intended for non-AGOL users, we needed a way of restricting access to the hosted services to just the Map Quiz web application.

This was achieved by registering the app on AGOL.  The resulting app id and secret were then used in a web proxy that granted access to the hosted services exclusively to the Map Quiz web application.  The proxy and instructions on how to implement it are here.


The spinning map on the landing page and the gradual zooming of each question are achieved using CSS3 animation and the animo JavaScript library (see below).  On modern browsers the animation effects are smooth and consistent.  With respect to the spinning map, the map edges needed to be expanded outward to avoid white patches appearing in the corners.  This was done by applying negative margins computed with the Pythagorean theorem.  One disadvantage of the spinning map is that image seams are occasionally observed.
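The margin arithmetic is simple to verify: a rectangle spinning about its center sweeps a circle whose diameter equals the rectangle's diagonal, so each edge must be pushed out by half the difference between the diagonal and that side.  A quick sketch of the computation (illustrative, not the application's code):

```python
import math

def spin_margins(width, height):
    """Negative margins (horizontal, vertical) needed so a rectangular map
    of the given size always covers its viewport while spinning: grow each
    side to the diagonal, offset by half the difference."""
    diagonal = math.hypot(width, height)
    return (diagonal - width) / 2.0, (diagonal - height) / 2.0
```

For a 300 by 400 pixel viewport the diagonal is 500, so the map needs 100 pixel margins left and right and 50 pixel margins top and bottom to spin without exposing the corners.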

Silverlight vs. JavaScript

As a reformed Silverlight developer I have been pleasantly surprised with the performance and capabilities of JavaScript-based web applications.  To date, I have yet to encounter any Silverlight capability that could not be achieved with HTML5/CSS3.  The biggest issue has been the paradigm shift from Silverlight's large well-documented framework to the necessity of working with a half dozen lightly documented open source libraries.

Libraries Used

As mentioned above, this project is based on a few open source libraries; these libraries are listed and described below.

  • animo.js by Daniel Raftery: A small but powerful library that provides programmatic creation and management of CSS animations.



  • Bootstrap by Twitter Inc: A useful and comprehensive UI framework.  This application leveraged a subset of Bootstrap pertaining to buttons and input elements.



  • jQuery by jQuery Foundation Inc: A JavaScript framework for DOM manipulation and a foundation for many other frameworks.


This project was a fun exercise and we hope the reader is equally entertained by the application.  We showed that geo-games can be easily created with Esri's JavaScript framework and cloud services together with common libraries like jQuery, Bootstrap and the Facebook API.


Map Lens

Posted by rcarmichael-esristaff Employee Nov 13, 2014


Map Lens is a sample JavaScript-based web application that demonstrates lensing, or what might better be described as "draggable map insets".  Lensing offers many benefits over traditional swiping, which is commonly used to horizontally transition between two web maps, for example, imagery before and after a hurricane.  In comparison, lensing offers the user an unlimited number of rectangular map overlays, for example, one lens per Esri basemap as shown above.

Click here for the live application.

The source code is available on agol and github.

Lensing is achieved with the ArcGIS API for JavaScript and leverages jQuery and jQueryUI, specifically the draggable and resizable methods.  The biggest challenge in developing this sample was handling (or suppressing) the various mouse and navigation events.

Map lensing is not without its disadvantages.  Maps embedded in each lens cannot contain dynamic content; ideally, maps should contain only tiled map or image services.  This is a performance consideration due to the number of navigation events invoked during lens interaction.

Image services are not only for serving imagery; they can also perform dynamic pixel-level analysis on multiple overlapping raster datasets using chained raster functions.  Image services deliver blazing fast performance with pre-processed source imagery, especially when it is served from a tile cache.  Dynamic image services, on the other hand, may not respond as quickly because of the additional processing demands they place on the server.  High performance is important for dynamic image services because the data is re-processed automatically each time the user pans or zooms the map.  Dynamic image services typically produce sub-second responses for simple types of analysis.  But more complex analysis may take much longer depending on the complexity of the processing chain and the number of input datasets involved.  How can we get better performance in those situations?  To answer that question I ran a series of tests to see how the following factors affect the performance of dynamic image services:


    • Map scale and source data resolution
    • Resampling method
    • Source data format and compression
    • Project on-the-fly
    • Request size

In this article I present the results of those tests along with some suggestions for maximizing performance.  My testing machine is a desktop computer running Windows 7 SP1 and ArcGIS 10.2.1 with 18GB of RAM and a quad-core Intel Xeon W3550 processor running at 3 GHZ.  The test data was stored on an otherwise empty 2 TB SATA hard drive that I defragmented and consolidated prior to testing.   The tests were configured to determine the average response times of services under various conditions.  By “response time” I mean the time it takes a service to retrieve the source data, process it, and transmit an output image.  Transmission time was minimized by running the testing application directly on the server machine.


This information is written with the intermediate to advanced GIS user in mind.  I assume the reader has a general understanding of image services, raster data and analysis, raster functions, geoprocessing, mosaic datasets, map projections, and map service caching.


Map Scale and Source Data Resolution


The pixels that are processed for analysis by dynamic image services are generally not identical to the pixels stored in the source datasets.  Instead, the source data pixels are first resampled on-the-fly to a new size based on the current scale of the map.  This formula shows the relationship between map scale and resampling size when the map units of the data are meters:


Resampled pixel size = map scale * 0.0254/96


The resampled pixel size is analogous to the “Analysis Cell Size” parameter in the Geoprocessing Framework and is sometimes referred to as the “pixel size of the request”.  As you zoom out to smaller map scales, the resampled pixel size increases until eventually the service resamples from the pixels in the pyramids.  Resampling from pyramids helps to keep the performance of the service relatively consistent over a range of map scales.
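The formula above can be wrapped in a small helper for experimentation (an illustrative sketch; the constant 0.0254/96 converts a 96 DPI screen pixel to meters):

```python
def request_pixel_size(map_scale, dpi=96):
    """Resampled pixel size ("pixel size of the request") in meters for a
    given map scale, assuming the data's map units are meters."""
    return map_scale * 0.0254 / dpi
```

At a map scale of 1:24,000 this gives a request pixel size of 6.35 meters; zooming out to 1:48,000 doubles it to 12.7 meters, which is why a service eventually resamples from pyramids rather than the full-resolution source.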



Chart 1. Performance of an image service that performs a binary overlay analysis over a range of map scales.

Performance still varies depending on map scale and typically looks similar to chart 1.    I generated these results using an application configured to simulate a single user panning the map 100 times in succession at specific map scales.  The chart shows the average time the service took to process and transmit the output images for different map scales.  This particular service was configured with a raster function template to perform a binary overlay analysis on eleven overlapping rasters in a mosaic dataset.  The pixel sizes of the source datasets ranged from 91.67 to 100 meters.  The raster function template was configured to return a binary result, where each output pixel is classified as either “suitable” or “unsuitable” based on the analysis parameters.


Take a look at the three points along the horizontal axis where the response time drops abruptly.  At those map scales the resampled pixel size is the same as the pixel sizes of the pyramids in the source data.  The processing time for resampling is lowest at those scales because there is nearly a 1:1 match between source data pixels and resampled pixels.  Client applications which use this particular service will see dramatically faster response times if they are somehow limited to only those scales.  One way to do this is to use a tiled basemap layer.  Web mapping applications which use tiled basemaps are generally limited to only those map scales.  The most commonly used tiling scheme is the ArcGIS Online/Bing Maps/Google Maps tiling scheme (referred to hereafter as the “AGOL tiling scheme” for brevity).  The red-dotted vertical lines in the chart indicate the map scales for levels 7 – 12 of this tiling scheme.  Unfortunately those scales are not very close to the scales where this service performs its best.  There are two options for aligning source data pixels and tiling scheme scales:


    1. Build a custom basemap with a custom tiling scheme that matches the pixel sizes of the data.
    2. Sample or resample the data to a pixel size that matches the tiling scheme of the basemap.

Chart 2. Performance of the binary overlay analysis service with different source data pixel sizes

The horizontal axis in chart 2 represents the "pixel size of the request" rather than map scale as in chart 1.  The orange graph shows the response times of another service configured identically to the first one (shown in blue), except it uses source datasets that were up-sampled to 38 meter pixels using the Resample geoprocessing tool.  Up-sampling to 38 meters aligned the service’s fastest response times with the AGOL tiling scheme scales, which resulted in a significant decrease in processing time at those scales, from approximately 1.5 seconds to about 0.5 seconds.  Furthermore, notice that performance improved at nearly all scales except for the very largest.  This is most likely because all the source data is at the same resolution (38 m) instead of three different resolutions (91.67 m, 92.5 m, 100 m), and/or because the source data pixels are also aligned between datasets (accomplished by defining a common origin point for each resampled raster using the “Snap Raster” environment setting).


Admittedly, using the Resample tool to prepare data for analysis is not ideal because it results in second-generation data that is less accurate than the original.  This may be perfectly acceptable for applications intended to provide an initial survey-level analysis; however, it’s best to generate new first-generation data at the desired pixel size whenever possible.  For example, if you have access to land-class polygons, you could use them to generate a new first-generation raster dataset at the desired pixel size using the Polygon to Raster tool, rather than resampling an existing land-class raster dataset.


To determine how much performance improved with 38 meter pixels, I calculated the percentage change in average response times for each scale and averaged the values over multiple scales.




Up-sampling the source data to 38 meter pixels reduced response times by 63.8% at the poorest-performing target map scales!  38 meters was not my only option in this example.  I could have chosen a size that corresponded to one of the other tiling scheme scales.  The following table lists all the map scales of the AGOL tiling scheme and the corresponding pixel sizes in units of meters, feet, and decimal degrees.  The three columns on the right provide suggested values for sampling raster data.  These suggestions are not set in stone; it’s not necessary to sample your data to exactly these recommended sizes.  The key is to choose a size that is slightly smaller than the pixel size of one of the target tiling scheme scales.
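The percentage-change calculation described above is straightforward.  Here is a minimal Python sketch; the timing values are hypothetical stand-ins for illustration, not the actual test measurements:

```python
# Percentage change in average response time per map scale, then
# averaged across scales.  Negative values indicate an improvement.
def percent_change(before: float, after: float) -> float:
    return (after - before) / before * 100.0

# Hypothetical per-scale average response times in seconds:
original = [1.5, 1.4, 1.6]    # original source data (91.67 - 100 m)
upsampled = [0.5, 0.6, 0.55]  # 38-meter resampled data

changes = [percent_change(b, a) for b, a in zip(original, upsampled)]
average_change = sum(changes) / len(changes)
print(f"{average_change:.1f}%")  # -63.1% with these stand-in values
```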


Map Scales and Pixel Sizes for the ArcGIS Online/Bing Maps/Google Maps tiling scheme



By the way, matching the pixel sizes of your data with a basemap tiling scheme is also useful for workflows that involve static imagery overlaid onto a tiled basemap.  For those cases, you can build mosaic dataset overviews for viewing at smaller scales instead of raster pyramids.  One of the great things about mosaic dataset overviews is that you can define the base pixel size of overviews as well as the scale factor to match your target tiling scheme.  This way you don't have to resample the source data to a new base pixel size in order to cater to any particular tiling scheme.

Resampling Method



The resampling method specified for an image service request also has an impact on performance.  The choice of which one to use should be based primarily on the type of data used in the analysis.  Chart 3 shows the performance of the binary overlay analysis service (with 38 meter data) with different resampling methods.


Chart 3. Response times of the binary overlay analysis service with different resampling methods




Bilinear resampling is the default method.  Here is how the response times for the other methods compared to bilinear averaged over the five map scales tested:


Raster Format


The storage format of the data can have a huge impact on performance.  For example, the response time of the binary overlay analysis service averaged over all map scales was 36% lower when the data was stored in the GeoTIFF format versus file geodatabase managed raster.  The Data Sources and Formats section of the Image Management guide book recommends leaving the data in its original format unless it is in one of the slower-performing formats such as ASCII.  GeoTIFF with internal tiles is the recommended choice for reformatting because it provides fast access to the pixels for rectangular areas that cover only a subset of the entire file.


Pixel Type and Compression


The pixel type determines the precision of the values stored in the data and can have a huge impact on performance.  In general, integer types are faster than floating-point types, and lower-precision types are faster than higher-precision types.  Compression of imagery can potentially increase or reduce performance depending on the situation.  For more information about the effect of compression on file size, refer to the Image Management guide book section on Data Sources and Formats.  To assess the impact of pixel type and compression on the performance of data stored on a local hard drive, I tested a group of image services configured to perform an extremely intensive overlay analysis on 15 raster datasets.  The services were configured identically except for the pixel and compression types of the analysis data.  The tests were run at the map scale corresponding to the pixel size of the data.


Charts 5 & 6. Avg. response time and storage size vs. compression type for an image service that performs a complex overlay analysis



The following table shows the percentage change in response times with the reformatted datasets versus the original double-precision floating-point dataset.


On-the-fly Projection


On-the-fly projection is a very important feature of the ArcGIS platform.  It has saved GIS users like me countless hours of work by eliminating the need to ensure that every dataset in a map is stored in the same coordinate system.  However, in some cases this flexibility and convenience may be costly when ultra-fast performance is required.  The following chart shows one of those cases.


Chart 8. Performance of a dynamic image service configured to perform a weighted overlay analysis.



Chart 8 shows the performance of a service which performs a weighted overlay analysis on six datasets in an Albers projection.  The upper graph shows the performance when the output of the service is set to Web Mercator (Auxiliary Sphere).  The lower graph shows the performance when the output of the service is in the same coordinate system as the data.  Performance without reprojection to Web Mercator improved by an average of 45% over all map scales.  This is a fairly extreme example.  The performance cost of reprojection is related to the mathematical complexity of the input and output projections.  Equal-area projections such as Albers are mathematically complex compared to cylindrical projections such as Mercator.  I have not run tests to prove this, but I expect that reprojection between two cylindrical projections such as UTM and Web Mercator would be less costly than in this example, and that a simple projection from geographic coordinates to Web Mercator would be less costly still.


To avoid on-the-fly projection you must ensure that all of your data is in the same coordinate system, including the basemap.  Most of the basemap services currently available from Esri on ArcGIS Online are in Web Mercator (auxiliary sphere).  So if you are going to use one of those basemaps, you would have to convert your data to the same coordinate system.  This can be an acceptable solution for some situations, but keep in mind that it results in second-generation data with less positional accuracy than the original source data.  Alternatively, you can create your own basemap in the same coordinate system as your data, and either publish it to an ArcGIS Server site or upload it to ArcGIS Online as a hosted map service.  If you take this approach, I recommend caching the basemap using a custom tiling scheme with scale levels that match the pixel sizes of your data.


Request Size


Request size is directly related to the size of the map window in the application and is specified in the REST API as the number of rows and columns of pixels in the output image.  To measure its impact on performance, I ran a series of tests at different request sizes on the weighted overlay analysis service that I used for the on-the-fly projection tests.  I measured the average response times for request sizes ranging from 400x400 to 2200x2200, increasing in 100 pixel increments (e.g. 500x500, 600x600, etc.).  All of the tests were run at the map scale of 1:113,386, which corresponds to the 30 meter pixel size of the source raster datasets.


Chart 9. Average response in MP/s for different request sizes for the weighted overlay service.


Chart 10. Average response time at different request sizes for the weighted overlay service.



Chart 9 shows that the throughput for this service levels off at a request size of approximately 1000x1000 pixels to about 1.5 – 1.6 MP/s.  Chart 10 shows that request size has a linear impact on performance.  This service is capable of providing sub-second response times for requests up to about 1,440,000 pixels, or a request size of 1200x1200.
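The relationship between request size, response time, and throughput can be sketched in a few lines of Python (the 0.9-second response time below is a hypothetical value consistent with the sub-second range reported above):

```python
# Throughput of an image service request in megapixels per second.
def throughput_mps(cols: int, rows: int, response_time_s: float) -> float:
    return (cols * rows) / 1_000_000 / response_time_s

# A 1200x1200 request contains 1,440,000 pixels.
pixels = 1200 * 1200
print(pixels)  # 1440000

# At a hypothetical 0.9-second response time, throughput is 1.6 MP/s,
# in line with the 1.5 - 1.6 MP/s plateau shown in chart 9.
print(round(throughput_mps(1200, 1200, 0.9), 2))  # 1.6
```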




Raster analysis can involve many stages of data processing.  Complex on-the-fly processing chains can place heavy loads on a server and contribute to sluggish performance.  Huge performance improvements can be achieved in some cases by pre-processing the data into a more efficient format for resampling and on-the-fly processing.


For applications which use tiled basemap layers, the greatest performance improvements are likely to be achieved by aligning the pixel sizes of the data with the scales of the basemap tiling scheme.  The section “Map Scales and Source Data Resolution” describes the theory behind this approach and provides a table with recommended pixel sizes for applications which use basemaps with the ArcGIS Online/Bing Maps/Google Maps tiling scheme.  Alternatively, developers can build basemaps with custom tiling schemes to align with the existing pixel sizes of the analysis data.


Another way to significantly reduce the processing load on a server in some cases is to avoid on-the-fly projection of the analysis data.  This is accomplished by ensuring that the basemap and the analysis data are in the same coordinate system.  The performance impact of on-the-fly projection varies depending on the input and output coordinate systems and is discussed in the section titled “On-the-fly Projection”.


The file format, pixel type, and compression type of the analysis data can also have a huge impact on performance.  GeoTIFF with internal tiles is recommended for situations where it’s necessary to re-format the data from a slower format.  Lower-precision pixel types give better performance than higher-precision types.  Pixel compression has the potential to either increase or decrease performance depending on how the data is stored and accessed by the server.  These topics are discussed in the sections titled “Raster Format” and “Pixel Type and Compression”.


Client applications can also play a role in dynamic image service performance.  Service response times are the lowest when applications specify nearest neighbor resampling, followed by bilinear resampling.  And there is a direct relationship between service performance and the size of the map window in an application.  These topics are discussed in the sections titled “Resampling Method” and “Request Size”.


The source code is available here.


The Landsat program recently celebrated its 40th birthday.  Since the launch of the first satellite in 1972, the program has amassed more than 3,000,000 images.


The USGS has published this archive as a single ArcGIS Image Service called LandsatLook. The prototype described in this posting uses this service in an HTML5 web mapping application.


While the map view is perfect for identifying an area of interest, it is not so useful for sorting through hundreds or thousands of overlapping images.  This prototype uses a control developed by LobsterPot called the PivotViewer to present a sortable collection of imagery.  Using the map and PivotViewer together, the presenter in the video above was able to quickly find and download a recent cloud-free image of London.

Contributed by Richie C.


This post discusses the release of a sample utility called Geometric Network Configuration Manager.  When it is necessary to temporarily remove a geometric network, this tool can later recreate it from a backed-up definition file.

The add-in can be downloaded from here.

The source code is available here or here on github.

Configuration Manager has a very long lineage.  More than a decade ago the Prototype Lab published Geodatabase Designer for documenting geodatabases and exchanging schema.  Designer is now obsolete, but fortunately many of its capabilities are now incorporated into ArcMap or other tools such as ArcGIS Diagrammer and XRay for ArcCatalog.

However, the one feature from Designer that has yet to be replicated is the ability to save and restore geometric networks.  This is useful when loading large amounts of data or performing a schema change such as switching a feature class from a simple edge to a complex edge.
The following section walks through the steps required to back up a geometric network, remove it, and then restore it.  The use case for this workflow could be bulk data loading or transferring a geometric network from a test server to a production server.

Walkthrough

Following the successful installation of the add-in, display the Geometric Network Tools toolbar and click the first button to launch the main dialog.  Drag and drop a geometric network into the Configuration Manager window.

The complete definition of the geometric network will be loaded into the dialog.  The four tabs below the ribbon make it possible to review and, in some cases, modify classes, weights, and connectivity rules.

The geometric network definition is currently stored in memory and should be saved to a file so that it can be backed up or restored at a later time.  Click the Save or Save As button to export the definition to a file with an esriGeoNet extension.

Recreating a geometric network is just a matter of loading an esriGeoNet file and clicking Export.  The application will prompt the user for the name and location of the exported geometric network.  If the geometric network already exists, it will be overwritten.

Known Issues:

  • The dialog that appears when the Export button is clicked may be hidden by the Configuration Manager window.  Either minimize or move the window to the side to continue with the export operation.

  • Adding and removing network classes and weights is currently not supported.
