BLOG
Hi Josh. The real credit goes to Dr. Mora and his team of researchers for their eye-opening work. Esri does not have permission to share the data, so I'm not in a position to give you access to the data or the service. For that, you would need to contact Dr. Mora directly. Thank you for your response, and good luck with your research!
05-20-2022 01:01 PM

POST
I should have mentioned that I'm using ArcGIS Online Notebooks. I'm not sure which version of the API it uses, but it must be an earlier one, since reproject is not available. I tried your second suggestion, Rhea, to set the out_spatial_reference property, and that worked. Thanks, Dan and Rhea, for the quick responses!
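For anyone who finds this thread later, here is roughly what the working fix looked like in my notebook. I'm assuming the arcgis.env module is the right home for the property, and the WKID is just a placeholder:

import arcgis

# Set the output spatial reference for subsequent raster function results
# (the out_spatial_reference property Rhea suggested; 3857 is a placeholder WKID).
arcgis.env.out_spatial_reference = 3857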
12-30-2020 12:35 PM

POST
Can I reproject the output of a raster function before saving it to a GIS? For example, this code applies an agricultural mask to a burnt area:

burnt_areas_masked = colormap(
    set_null(rasters=[ag_mask, burnt_areas]),
    colormap=[[4, 115, 76, 0], [3, 168, 112, 0], [2, 230, 152, 0], [1, 255, 170, 0]])

The inputs 'ag_mask' and 'burnt_areas' are in-memory rasters generated by raster functions applied to imagery layer items in the GIS. The output 'burnt_areas_masked' inherits the spatial reference of 'ag_mask'. I guess this makes sense, since 'ag_mask' is the first raster in the list of inputs. But 'ag_mask' is only a Boolean layer, not the actual input data. So I need a way to set the spatial reference of the output 'burnt_areas_masked' to the spatial reference of 'burnt_areas'. The reproject function looks like a possible solution: https://developers.arcgis.com/python/api-reference/arcgis.raster.functions.html#reproject But I get a NameError when I try to call it. Any suggestions?
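For reference, this is roughly the call I attempted, based on the documented signature (the WKID is just an example):

from arcgis.raster.functions import reproject  # present in newer API versions;
                                               # missing in mine, hence the NameError

burnt_areas_projected = reproject(
    raster=burnt_areas_masked,
    spatial_reference={'wkid': 102008})  # example WKID only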
12-29-2020 03:47 PM

BLOG
I think you have the right idea. It's great that you are getting good performance at 1:10,000, even though one of the pyramid levels in your source data corresponds to a map scale of 1:10,500. That's the power of Image Server at work! Image Server is highly optimized out of the box. For typical service configurations and usage, details like the pixel sizes of the source data are not a significant factor in overall performance. However, based on my results, I expect you would get even better performance if that pyramid level corresponded to 1:9,500 instead of 1:10,500, although the improvement might not be perceptible to a human in your case. Putting this in perspective, the service I used to generate my results performed a server-side analysis on 11 different raster datasets, so the performance impact of resampling from 11 datasets with sub-optimal pixel sizes was greatly amplified. Typical image services do not require this amount of heavy lifting. But for cases like mine, where performance was not satisfactory, this is one approach that can potentially bring huge gains in performance.
01-08-2020 07:35 PM

POST
The portal I was trying to connect to was inside our firewall, while the notebook was running in a GIS outside the firewall. Makes perfect sense now. Thanks!
11-01-2019 09:50 AM

POST
I am able to create a connection to an ArcGIS Online org from an ArcGIS Notebook. However, when I try to connect to a different (not "home") enterprise GIS from an ArcGIS Notebook, I get this error:

URLError: <urlopen error [Errno -2] Name or service not known>

Is this supported?
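For context, the connection attempts look roughly like this (the portal URL and user name are placeholders):

from arcgis.gis import GIS

# Connecting to the notebook's home org works fine:
gis_online = GIS("home")

# Connecting to a separate Enterprise portal raises the URLError:
gis_enterprise = GIS("https://portal.example.com/portal", "some_user")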
10-31-2019 02:53 PM

BLOG
In June of 2017 we began another collaboration with Dr. Camilo Mora of the University of Hawaii, Department of Geography. This came on the heels of our previous project with Dr. Mora to develop a web mapping application to display his team's research on climate change and deadly heatwaves. For their next project, they had expanded their research to include multiple cumulative hazards to human health and well-being resulting from climate change. These hazards include increased fires, fresh water scarcity, deforestation, and several others. Their new research was recently published in the journal Nature Climate Change. Several news outlets published stories on their findings, including these from The New York Times, Le Monde, and Science et Avenir.

For our part, the Applications Prototype Lab developed an interactive web mapping application to display their results. To view the application, click on the following image. To learn how to use the application, and about the research behind it, click on the links for "Help" and "Learn More" at the top of the application. In this post I'll share some of the technical details that went into building this application.

The Source Data

For each year of the study, 1956 - 2095, the research team constructed a series of global data sets for 11 climate-related hazards to human health and well-being. From those data sets they built a global cumulative hazards index for each year of the study. For information about their methods for generating these data sets, refer to their published article in Nature Climate Change. Each data set contains the simulated (historical) or projected (future) change in intensity of a particular hazard relative to a baseline year of 1955. For the years 2006 - 2095, hazards were projected under three different scenarios of greenhouse gas (GHG) emissions, ranging from a worst-case business-as-usual scenario to a best-case scenario in which humanity implements strong measures to reduce GHG emissions. In total, they produced 3828 unique global data sets of human hazards resulting from climate change.

Data Pre-processing

We received the data as CSV files which contained the hazard values on a latitude-longitude grid at a spatial resolution of 1.5 degrees. The CSV format is useful for exchanging data between different software platforms; however, it is not a true spatial data format. So we imported the data from the CSV files into raster datasets. This is typically a two-step process where you first import the CSV files into point feature classes and then export the points to raster datasets. However, since the data values for the 11 hazards were not normalized to a common scale, we added a step to re-scale the values to a range of 0 - 1, according to the methodology of the research team, where:

0 equals no increase in hazard relative to the historical baseline value.
1 equals the value at the 95th percentile or greater of increased hazard between 1955 and 2095 for the "business-as-usual" GHG emissions scenario.

At a spatial resolution of 1.5 degrees, each pixel in the output raster datasets is approximately 165 km in width and height. This was too coarse for the web app, because the data for land-based hazards such as fire and deforestation extended quite a distance beyond the coastlines. So we added another processing step to up-sample each dataset by a factor of ten and remove the pixels from the land-based hazard raster datasets whose centers were outside of a 5 km buffer of the coastlines.
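To give a flavor of that workflow, here is a simplified sketch of the conversion for a single file; the paths, field names, and the 95th-percentile value are placeholders rather than the actual parameters we used:

import arcpy
from arcpy.sa import Con, Raster

arcpy.CheckOutExtension("Spatial")

csv_file = "hazard_1956.csv"          # placeholder input file
points = "in_memory/hazard_points"
raw_raster = "hazard_1956_raw.tif"

# Step 1: import the CSV into a point feature class.
arcpy.management.XYTableToPoint(csv_file, points, x_field="lon", y_field="lat")

# Step 2: export the points to a raster dataset with 1.5-degree cells.
arcpy.conversion.PointToRaster(points, "hazard_value", raw_raster, cellsize=1.5)

# Step 3: re-scale to 0 - 1 with a map algebra expression, where p95 is the
# 95th-percentile increase for the business-as-usual scenario
# (computed separately; the value here is a placeholder).
p95 = 2.5
r = Raster(raw_raster)
scaled = Con(r > p95, p95, Con(r < 0, 0, r)) / p95
scaled.save("hazard_1956_scaled.tif")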
We automated the entire process with Python scripts, using geoprocessing tools to convert the source data from CSV to raster datasets, build the coastal buffer, and up-sample and clip the land raster datasets. To re-scale the data values, we used mathematical expressions. At the end of these efforts we had two collections of raster datasets: one for the 11 hazard indexes, and another for the cumulative hazards index.

Data Publishing

We built two mosaic datasets to organize and catalog each collection of raster datasets. From each mosaic dataset we published an image service to provide the web application with endpoints through which it could access the data. On the application, the map overlay layer is powered by the image service for the cumulative hazards index data. This layer is displayed in red with varying levels of transparency to indicate the level of cumulative hazards at any location. To support this type of rendering, we added a custom processing template to the image service's source mosaic dataset. The processing template uses the Stretch function to dynamically re-scale the floating-point data values in the source raster datasets to 8-bit integers, and the Attribute Table function to provide the color and transparency values of the exported image on a per-pixel basis.

The Animations

We built short video animations of the change in cumulative hazards over time using the Time and Animation toolbars in ArcGIS Pro. You can access those animations from the application by clicking on the "Animations" link at the top of the application window. We used the cumulative hazards index image service as the data source of the animations. This service is time-aware, enabling us to define a timeline for the animations. Using the capabilities of the Animation toolbar, we defined properties such as the time-step interval and duration, total duration, output format and resolution, and the various overlays such as the legend, watermarks, and dynamic text to display the year. We filtered the data in the image service by GHG emissions scenario using definition queries to create three separate animations of the change in cumulative hazards over time.

The Web Application

We built the web application using the ArcGIS API for JavaScript. To render the cumulative hazards map layer, the application requests the data from the image service in the LERC format. This format enables the application to get the color and transparency values for each pixel from the attribute table and build a client-side renderer for displaying the data. The chart that appears when you click on the map was built with the Dojo charting library. This chart is powered by the image service for the 11 individual hazard indexes. To access the hazards data, the web application uses the Identify operation to get the values for all 11 hazards at the clicked location with a single web request to the service.

In Summary

Building this application gave us the opportunity to leverage many capabilities of the ArcGIS platform that are well suited to scientific analysis and display. If you are inspired to build similar applications, then I hope this post provides you with some useful hints. If you have any technical questions, add them to the comments and I'll try to answer them. I hope this application helps to extend the reach of this important research as humanity seeks to understand the current and projected future impacts of climate change.
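As a footnote for Python users: the Identify capability the web app relies on is also exposed through the ArcGIS API for Python, so you could pull the per-pixel hazard values into a notebook with something like this sketch (the service URL and coordinates are placeholders):

from arcgis.raster import ImageryLayer

# Placeholder URL standing in for the hazards image service.
lyr = ImageryLayer("https://example.com/arcgis/rest/services/Hazards/ImageServer")

# Identify the pixel value at a location (WGS84 longitude/latitude).
result = lyr.identify(geometry={"x": -157.8, "y": 21.3,
                                "spatialReference": {"wkid": 4326}})
print(result)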
11-30-2018 05:45 PM

BLOG
Every now and then a really unique, out-of-the-box idea comes our way that expands our conception of the possible applications of the ArcGIS platform. This was one of those ideas: could GIS be used to map the human body? More specifically, could we use CityEngine to visualize the progress of physical therapy for our friend and Esri colleague Pat Dolan of the Solutions team? Pat was eager to try this out, and he provided a table of measurements taken by his physical therapist to track his ability to grip and extend his fingers over time. With the help of the CityEngine team, we developed a 3D model of a hand and used CityEngine rules to apply Pat's hand measurements to the model. We thought it would be fun to show a train station in a city that would magically transform into a hand. Our hand model is not quite anatomically correct, but it has all the digits, and they are moveable! Click the image above to view a short video of this project. Pat and I showed this application, and others, at the 2017 Esri Health and Human Services GIS Conference in Redlands. Click here to view a video of that presentation.
05-24-2018 03:39 PM

BLOG
Among the best resources for learning the ArcGIS API for Python are the sample notebooks on the developers website. A new sample notebook is now available that demonstrates how to perform a network analysis to find the best locations for new health clinics for amyotrophic lateral sclerosis (ALS) patients in California. To access the sample, click on the image at the top of this post. I originally developed this notebook for a presentation that my colleague Pat Dolan and I gave at the Esri Health and Human Services GIS Users Conference in Redlands, California in October. Although network analysis is available in many of Esri's offerings, we chose the Jupyter Notebook, an open-source, browser-based coding environment, to show the attendees how they could document and share research methodology and results using the ArcGIS API for Python. This sample notebook provides a brief introduction to network analysis and walks you through our methodology for siting new clinics, including accessing the analysis data, configuring and performing analyses, and displaying the results in maps.
12-22-2017 11:54 AM

BLOG
One of the great things about working in the Lab is you get to experiment with the new goodies from our core software developers before they are released. When I heard that version 1.2 of the ArcGIS API for Python would include a new module for raster functions, I could not wait to give it a try. Now that v1.2 of the API is released, I can finally show you a Jupyter Notebook I built which has an example of a weighted overlay analysis implemented with raster functions. The following is a non-interactive version of that notebook which I exported to HTML. I hope it will give you some ideas for how you could use the ArcGIS API for Python to perform your own raster analysis.

Finding Natural and Accessible Areas in the State of Washington, USA

The weighted overlay is a standard GIS analysis technique for site-suitability and travel-cost studies. This notebook leverages the new "arcgis.raster.functions" module in the ArcGIS API for Python 1.2 to demonstrate an example of a weighted overlay analysis. This example attempts to identify areas in the State of Washington that are "natural" while also being easy to travel within, based on the following criteria:

elevation (lower is better)
steepness of the terrain (flatter is better)
degree of human alteration of the landscape (less is better)

The input data for this analysis includes a DEM (Digital Elevation Model) and a dataset showing the degree of human modification to the landscape. In general, weighted overlay analysis can be divided into three steps:

Normalization: The pixels in the input raster datasets are reclassified to a common scale of numeric values based on their suitability according to the analysis criteria.
Weighting: The normalized datasets are assigned a percent influence based on their importance to the final result by multiplying them by values ranging from 0.0 - 1.0. The sum of the values must equal 1.0.
Summation: The sum of the weighted datasets is calculated to produce the final analysis result.

We'll begin by connecting to the GIS and accessing the data for the analysis.

Connect to the GIS

In [1]:
# import GIS from the arcgis.gis module
from arcgis.gis import GIS
# Connect to the GIS.
try:
    web_gis = GIS("https://dev004543.esri.com/arcgis", 'djohnsonRA')
    print("Successfully connected to {0}".format(web_gis.properties.name))
except Exception as e:
    print(e)

Enter password:········
Successfully connected to ArcGIS Enterprise A

Search the GIS for the input data for the analysis

Human Modified Index

In [2]:
# Search for the Human Modified Index imagery layer item by title
item_hmi = web_gis.content.search('title:Human Modified Index', 'Imagery Layer')[0]
item_hmi

Out[2]:
Human Modified Index
A measure of the degree of human modification; the index ranges from 0.0 for a virgin landscape condition to 1.0 for the most heavily modified areas.
Imagery Layer by djohnsonRA
Last Modified: July 06, 2017

Elevation

In [3]:
# Search for the DEM imagery layer item by title
item_dem = web_gis.content.search('title:USGS NED 30m', 'Imagery Layer')[0]
item_dem

Out[3]:
USGS NED 30m
The National Elevation Dataset (NED) is the primary elevation data product of the USGS. This version was resampled to 30m from source data at 1/3 arc-second resolution and projected to an Albers Equal Area coordinate system.
Imagery Layer by djohnsonRA
Last Modified: July 06, 2017

Study area boundary and extent

In [4]:
# Search for the State of Washington feature layer item by title
item_studyarea = web_gis.content.search('title:State of Washington, USA',
                                        'Feature Layer')[0]
item_studyarea

Out[4]:
State of Washington, USA
Feature Layer Collection by djohnsonRA
Last Modified: July 07, 2017

In [5]:
# Get a reference to the feature layer from the portal item
lyr_studyarea = item_studyarea.layers[0]
lyr_studyarea

Out[5]:
<FeatureLayer url:"https://dev004543.esri.com/server/rest/services/Hosted/Washington/FeatureServer/1">

Get the coordinate geometry of the study area

In [6]:
# Query the study area layer to get the boundary feature
query_studyarea = lyr_studyarea.query(where='1=1')
# Get the coordinate geometry of the study area.
# The geometry will be used to extract the Elevation and Human Modified Index data.
geom_studyarea = query_studyarea.features[0].geometry
# Set the spatial reference of the geometry.
geom_studyarea['spatialReference'] = query_studyarea.spatial_reference

Get the extent of the study area

In [7]:
# Import the geocode function
from arcgis.geocoding import geocode
# Use the geocode function to get the location/address of the study area
geocode_studyarea = geocode('State of Washington, USA',
                            out_sr=query_studyarea.spatial_reference)

In [8]:
# Get the geographic extent of the study area
# This extent will be used when displaying the Elevation, Human Modified Index,
# and final result data.
extent_studyarea = geocode_studyarea[0]['extent']
extent_studyarea

Out[8]:
{'xmax': -1451059.3770040546,
 'xmin': -2009182.5321227335,
 'ymax': 1482366.818700374,
 'ymin': 736262.260048952}

Display the analysis data

Human Modified Index

In [9]:
# Get a reference to the imagery layer from the portal item
lyr_hmi = item_hmi.layers[0]
# Set the layer extent to geographic extent of study area and display the data.
lyr_hmi.extent = extent_studyarea
lyr_hmi

Out[9]:

Elevation

In [10]:
# Get a reference to the imagery layer from the portal item
lyr_dem = item_dem.layers[0]
# Set the layer extent to the geographic extent of study area and display the data.
lyr_dem.extent = extent_studyarea
lyr_dem

Out[10]:

Slope (derived from elevation via the Slope raster function)

In [11]:
# Import the raster functions from the ArcGIS API for Python (new to version 1.2!)
from arcgis.raster.functions import *

In [12]:
# Derive a slope layer from the DEM layer using the slope function
lyr_slope = slope(dem=lyr_dem, slope_type='DEGREE', z_factor=1)
# Use the stretch function to enhance the display of the slope layer.
lyr_slope_stretch = stretch(raster=lyr_slope, stretch_type='StdDev', dra='true')
# Display the stretched slope layer within the extent of the study area.
lyr_slope_stretch.extent = extent_studyarea
lyr_slope_stretch

Out[12]:

Extract the data within the study area geometry

Use the Clip raster function to extract the analysis data from within the study area geometry.

Human Modified Index

In [13]:
# Extract the Human Modified Index data from within the study area geometry
hmi_clipped = clip(raster=lyr_hmi, geometry=geom_studyarea)
hmi_clipped

Out[13]:

Elevation

In [14]:
# Extract the Elevation data from within the study area geometry
elev_clipped = clip(raster=lyr_dem, geometry=geom_studyarea)
elev_clipped

Out[14]:

Slope

In [15]:
# Extract the Slope data from within the study area geometry
slope_clipped = clip(raster=lyr_slope, geometry=geom_studyarea)
# Apply the Stretch function to enhance the display of the slope_clipped layer.
slope_clipped_stretch = stretch(raster=slope_clipped, stretch_type='StdDev',
dra='true')
slope_clipped_stretch

Out[15]:

Perform the analysis

Step 1: Normalization

Use the Remap function to normalize each set of input data to a common scale of 1 - 9, where 1 = least suitable and 9 = most suitable.

In [16]:
# Create a colormap to display the analysis results with 9 colors ranging
# from red to yellow to green.
clrmap= [[1, 230, 0, 0], [2, 242, 85, 0], [3, 250, 142, 0], [4, 255, 195, 0],
[5, 255, 255, 0], [6, 197, 219, 0], [7, 139, 181, 0], [8, 86, 148, 0],
          [9, 38, 115, 0]]

In [17]:
# Normalize the elevation data
elev_normalized = remap(raster=elev_clipped,
input_ranges=[0,490, 490,980, 980,1470, 1470,1960, 1960,2450,
2450,2940, 2940,3430, 3430,3700, 3920,4100],
output_values=[9,8,7,6,5,4,3,2,1], astype='U8')
# Display color-mapped image of the reclassified elevation data
colormap(elev_normalized, colormap=clrmap)

Out[17]:

In [18]:
# Normalize the slope data
slope_normalized = remap(raster=slope_clipped,
input_ranges=[0,1, 1,2, 2,3, 3,5, 5,7, 7,9, 9,12, 12,15,
15,100],
output_values=[9,8,7,6,5,4,3,2,1], astype='U8')
# Display a color-mapped image of the reclassified slope data
colormap(slope_normalized, colormap=clrmap)

Out[18]:

In [19]:
# Normalize the Human Modified Index data
hmi_normalized = remap(raster=hmi_clipped,
input_ranges=[0.0,0.1, 0.1,0.2, 0.2,0.3, 0.3,0.4, 0.4,0.5,
0.5,0.6, 0.6,0.7, 0.7,0.8, 0.8,1.1],
output_values=[9,8,7,6,5,4,3,2,1], astype='U8')
# Display a color-mapped image of the reclassified HMI data
colormap(hmi_normalized, colormap=clrmap)

Out[19]:

Step 2: Weighting

Use the overloaded multiplication operator * to assign a weight to each normalized dataset based on its relative importance to the final result.

In [20]:
# Apply weights to the normalized data using the overloaded multiplication
# operator "*".
# - Human Modified Index: 60%
# - Slope: 25%
# - Elevation: 15%
hmi_weighted = hmi_normalized * 0.6
slope_weighted = slope_normalized * 0.25
elev_weighted = elev_normalized * 0.15

Step 3: Summation

Add the weighted datasets together to produce the final analysis result.

In [21]:
# Calculate the sum of the weighted datasets using the overloaded addition
# operator "+".
result_dynamic = colormap(hmi_weighted + slope_weighted + elev_weighted,
colormap=clrmap, astype='U8')
result_dynamic

Out[21]:

The same analysis can also be performed in a single operation

In [22]:
result_dynamic_one_op = colormap(
raster=
(
# Human modified index layer
0.60 * remap(raster=clip(raster=lyr_hmi, geometry=geom_studyarea),
input_ranges=[0.0,0.1, 0.1,0.2, 0.2,0.3, 0.3,0.4, 0.4,0.5,
0.5,0.6, 0.6,0.7, 0.7,0.8, 0.8,1.1],
output_values=[9,8,7,6,5,4,3,2,1])
+
# Slope layer
0.25 * remap(raster=clip(raster=lyr_slope, geometry=geom_studyarea),
input_ranges=[0,1, 1,2, 2,3, 3,5, 5,7, 7,9, 9,12, 12,15,
15,100],
output_values=[9,8,7,6,5,4,3,2,1])
+
# Elevation layer
0.15 * remap(raster=clip(raster=lyr_dem, geometry=geom_studyarea),
input_ranges=[-90,250, 250,500, 500,750, 750,1000, 1000,1500,
1500,2000, 2000,2500, 2500,3000, 3000,5000],
output_values=[9,8,7,6,5,4,3,2,1])
),
colormap=clrmap, astype='U8')
result_dynamic_one_op

Out[22]:

Generate a persistent analysis result via distributed, server-based raster processing

Portal for ArcGIS has been enhanced with the ability to perform distributed, server-based processing on imagery and raster data. This technology enables you to boost the performance of raster processing by processing data in a distributed fashion, even at full resolution and full extent. You can use the processing capabilities of ArcGIS Pro to define the processing to be applied to raster data and perform that processing in a distributed fashion using your on-premises portal. The results of this processing can be accessed in the form of a web imagery layer that is hosted in your ArcGIS organization. For more information, see Raster analysis on Portal for ArcGIS.

In [23]:
# Does the GIS support raster analytics?
import arcgis
arcgis.raster.analytics.is_supported(web_gis)

Out[23]:
True

In [24]:
# The .save() function invokes generate_raster from the arcgis.raster.analytics
# module to run the analysis on a GIS server at the source resolution of the
# input datasets and store the result as a persistent web imagery layer in the GIS.
result_persistent = result_dynamic.save("NaturalAndAccessible_WashingtonState")
result_persistent

Out[24]:
NaturalAndAccessible_WashingtonState
Analysis Image Service generated from GenerateRaster
Imagery Layer by djohnsonRA
Last Modified: July 07, 2017

In [25]:
# Display the persistent result
lyr_result_persistent = result_persistent.layers[0]
lyr_result_persistent.extent = extent_studyarea
lyr_result_persistent

Out[25]:

Data Credits:

Human Modified Index: A measure of the degree of human modification; the index ranges from 0.0 for a virgin landscape condition to 1.0 for the most heavily modified areas. The average value for the United States is 0.375. The data used to produce these values should be both more current and more detailed than the NLCD used for generating the cores. Emphasis was given to attempting to map, in particular, energy-related development. Theobald, DM (2013). A general model to quantify ecological integrity for landscape assessment and US application. Landscape Ecol 28:1859-1874. doi: 10.1007/s10980-013-9941-6

USGS NED 30m: Data available from the U.S. Geological Survey. See the USGS Visual Identity System Guidance for further details. Questions concerning the use or redistribution of USGS data should be directed to ask@usgs.gov or 1-888-ASK-USGS (1-888-275-8747). NASA Land Processes Distributed Active Archive Center (LP DAAC) Products Acknowledgement: These data are distributed by the Land Processes Distributed Active Archive Center (LP DAAC), located at USGS/EROS, Sioux Falls, SD.

State of Washington: Esri Data & Maps
07-11-2017 11:22 AM

BLOG
In the spring of last year, Dr. Camilo Mora of the University of Hawaii at Manoa contacted our team. He wanted to know if we would be interested in developing an interactive web map to display the results of a research project he was leading into the effect of rising global temperatures on climatic conditions resulting in human mortality due to heat stress. We were glad to hear from Dr. Mora again. The year before, we had developed a web map to display the results of his research into how climate change could affect the length of plant growing seasons.

For this new study, Dr. Mora's team analyzed 1763 documented lethal heat events and identified a global threshold beyond which daily mean surface air temperature and relative humidity became deadly. Using Earth System Models to predict daily values for temperature and humidity across the globe up to the year 2100, they estimated the likely number of lethal heat days annually under low, moderate, and high carbon emissions scenarios. Although his new research project was still in its early stages, we found the initial results to be very compelling and we agreed to move forward with the project. Using preliminary data from their research, we explored some ideas for how to present the data and developed a couple of prototype applications.

Several months later, we heard from Dr. Mora again. His team had completed their research, and he was ready to share his finalized data with us and to collaborate on the design of the final application. The time frame was short: Dr. Mora and his team were writing the final drafts of a paper for publication in the journal Nature Climate Change. So we rolled up our sleeves, reviewed our initial prototypes, explored the finalized data, and then got straight to work.

The application, Heatwaves: Number of deadly heat days, leverages the robust capabilities of the ArcGIS platform to distill complex scientific data into intuitive maps that enable users to interact with and understand the data. This was an interesting development project, not only for its subject matter, but also on a technical level. So we thought it would be worthwhile to share some details about how we built the application.

At the front end of the application is a web map developed using jQuery, Bootstrap, and the ArcGIS API for JavaScript. The map contains an image service layer which displays the number of lethal heat days at any location over land, using a color ramp ranging from white to yellow to red to black, representing 0 - 365 days respectively, for any year from 1950 - 2100. You can select the year from a slider control or from a drop-down list. The data on the annual number of lethal heat days for the years 1950 - 2005 are based on historic climate records. The data for the years 2006 - 2100 are based on the average of 20 Earth System Models developed for the Coupled Model Intercomparison Project Phase 5, under low, moderate, and high (i.e., "business as usual") carbon emissions scenarios (i.e., Representative Concentration Pathways, RCPs 2.6, 4.5, and 8.5, respectively). By selecting from a drop-down list of RCPs, you can view the modeled results for the different carbon emissions scenarios.

When you click on a location over land, a window appears with a line chart and a scatter plot that reveal further insights into the study results for that location. The line chart displays the trend in the annual number of lethal heat days at the location for each year of the study period.
The scatter plot displays the temperature and humidity for each day of the selected year over a curve which represents the lethal heat threshold.

Now let's take a look at some of the deeper technical details of this application. On the back end of the application are two web services that deliver the data from the study results to the web application for display. These services are hosted on the Esri Applications Prototype Lab's GIS portal.

An image service provides the web application with the data for the annual number of lethal heat days for each year of the study period. The data source of the service is a mosaic dataset that defines a single point of access to a collection of single-band raster datasets of global extent. Each raster dataset contains the number of lethal heat days across the globe for a given year. For the historical period 1950 - 2005, the data for each year are stored in a single raster dataset. For the future period 2006 - 2100, the data for each year are stored in three raster datasets, one for each of the carbon emissions scenarios. The image service has two roles: 1) to provide the images showing the annual number of lethal heat days for display in the web map, and 2) to provide the data for the graph of the trend over time in the annual number of lethal heat days. To generate the images for the map layer, the mosaic dataset applies a raster function chain that dynamically clips the source raster datasets to the coastlines and applies a color ramp to convert the single-band source data into three-band RGB output images. To provide the data for the trend graph, the service delivers the pixel values at a given location from each of the historic rasters and from the future rasters for the selected carbon emissions scenario.

A geoprocessing service provides the data for the chart that plots the temperature and relative humidity for each day of a given selected year. The source data for this service are a collection of 36 NetCDF files that contain the daily values for temperature and relative humidity for the study period and for each carbon emissions scenario. Each file contains data for a twenty-year period for either temperature or relative humidity, for the historic period or for one of the three carbon emissions scenarios. In total, the files use 17 GB of storage and contain 12,570,624 unique points of data. To build this service, we started by developing a Python script with input parameters for the selected year, the selected carbon emissions scenario, and the coordinates of the location where the user clicked on the map. The script obtains the requested data from those files in four steps:

1. The NetCDF files containing the relevant temperature and humidity data are identified from the first two input parameters.
2. In-memory tables are created from the files using the Make NetCDF Table View geoprocessing tool.
3. Queries are crafted to obtain the temperature and humidity values from the tables for each day of the selected year at the specified location.
4. The results of the queries are sorted by day of the year and returned to the client application.

The Python script was then wrapped into a Python script tool and published as a geoprocessing service.
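To illustrate step 2, here is a simplified sketch of creating a table view from one of the NetCDF files with arcpy; the file name, variable name, dimension names, and coordinates are placeholders rather than the actual values from the project:

import arcpy

# Create an in-memory table view of the daily temperature values
# (file, variable, and dimension names are hypothetical).
arcpy.md.MakeNetCDFTableView(
    in_netCDF_file="temperature_rcp85_2081_2100.nc",
    variable="tasmax",
    out_table_view="temperature_view",
    row_dimension="time",
    dimension_values=[["lat", "21.3"], ["lon", "-157.8"]])

# Read the values back, e.g. for the selected year, via a search cursor.
with arcpy.da.SearchCursor("temperature_view", ["time", "tasmax"]) as rows:
    for time_value, temperature in rows:
        print(time_value, temperature)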
The application also includes links to three video animations showing the increase in lethal heat days over time for each of the carbon emissions scenarios. These videos were created using the animation tools in ArcGIS Pro. The videos representing RCP 2.6, RCP 4.5, and RCP 8.5 can be viewed here, here, and here. Links to the videos and the source code of the application are also available from the application when you click the "About" button at the top right corner.

In conclusion, we'd like to thank Dr. Mora and his team for their very important research and for the opportunity to contribute in our own way toward helping to extend the reach of their findings. We enjoyed working with Dr. Mora and hope to collaborate with him on future projects.

In the news

Deadly heat waves becoming more common due to climate change | CNN
Deadly Heat Waves Could Threaten 3 in 4 People by 2100 | HUFFPOST
Half of World Could See Deadly Heat Waves By 2100 | Climate Central
Study shows deadly heat waves are becoming more frequent | Chicago Tribune
A third of the world now faces deadly heatwaves as result of climate change | The Guardian
By 2100, Deadly Heat May Threaten Majority of Humankind | National Geographic
Deadly heatwaves could affect 74 percent of the world's population | University of Hawaii News
Deadly Heat Waves Could Endanger 74% of Mankind by 2100, Study says | Inside Climate News
Billions to Face 'Deadly Threshold' of Heat Extremes by 2100, Study Finds | EcoWatch
Killer Heat Waves Will Threaten Majority of Humankind by Century's End | Alternet
06-22-2017 06:03 PM

POST
Atma, it sounds very cool to be able to access raster functions as Python functions. I look forward to getting my hands on that capability! It's good to know your team is already working on it. Thanks for the tip; that link you provided was actually my starting point for trying out this idea. Thanks again!
04-12-2017 01:15 PM

POST
I'm trying to figure out how to wrap a raster function chain into a Python function that has parameters for the input raster(s) of the function chain. I also need to display the output in the map widget. My eventual goal is to create wrappers for function chains which have multiple input branches, but for now, here is a very simple example I have tried which requires only one input raster parameter.

# code for context
type(dem_item)
arcgis.gis.Item

dem_item.layers
<ImageryLayer url:"http://dev004545.esri.com/arcgis/rest/services/Hosted/NED_1r3_CONUS_Albers_30m/ImageServer">

# python wrapper function
def slope_template(in_ras):
    return {
        "rasterFunction": "Slope",
        "rasterFunctionArguments": {
            "DEM": in_ras
        }
    }

# Add the output of the function to the map widget
map1.add_layer(slope_template(dem_item.layers[0]))

Running this code produces the following error:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-26-4095d219143b> in <module>()
----> 1 map1.add_layer(slope_template(dem_item.layers[0]))

C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\arcgis\widgets\__init__.py in add_layer(self, item, options)
    142
    143             self._addlayer = json.dumps(js_layer)
--> 144         elif 'layers' in item:  # items as well as services
    145             if item.layers is None:
    146                 raise RuntimeError('No layers accessible/available in this item or service')

TypeError: argument of type 'NoneType' is not iterable
04-12-2017 12:02 PM

POST
Is there any doc or guidance for defining a client-side raster function template with chained raster functions that goes into more detail than the info in the doc for raster function objects? I have composed a somewhat complex function chain in the raster function template editor in ArcGIS Pro which I need to convert to a client-side template. The function chain has three separate branches, each of which starts with a different source raster, and it has about 12 functions consisting of Clip, Slope, Remap, Arithmetic, Cell Statistics, and Colormap. These are all "well-known functions" and are documented individually in the above link (with the exception of Cell Statistics, which I can substitute with the Local function). The function chain was not very difficult to build in Pro's template editor. However, the complexity of this function chain goes far beyond the example in the REST doc, which simply chains the Remap and Colormap functions. Are there any tools that could help convert the .rft.xml file I generated in Pro to a nice, clean JSON raster function object, or do I just need to roll up my sleeves and start from the beginning with a text editor? My eventual goal is to wrap the template into a Python function and use it with the Python API.
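For reference, here is the kind of nesting I mean, modeled on the Remap-plus-Colormap example in the REST doc and written as a Python dict since my end goal is the Python API; the remap ranges and colors are placeholders:

slope_remap_colormap = {
    "rasterFunction": "Colormap",
    "rasterFunctionArguments": {
        "Colormap": [[1, 255, 0, 0], [2, 0, 255, 0]],  # placeholder colors
        "Raster": {
            "rasterFunction": "Remap",
            "rasterFunctionArguments": {
                "InputRanges": [0, 10, 10, 90],   # placeholder ranges
                "OutputValues": [1, 2],
                "Raster": {
                    "rasterFunction": "Slope",
                    "rasterFunctionArguments": {
                        "DEM": "$1"  # placeholder for one source raster branch
                    }
                }
            }
        }
    }
}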
04-07-2017 04:17 PM