Building a Data Driven Organization, Part #6: Consume a REST API Without Coding

09-08-2021 10:07 AM
BruceHarold
Esri Regular Contributor

"You'll need to write a connector".  That's what people used to say when a need arose to build an integration between ArcGIS and another system.  Nowadays however, the information technology landscape has matured and how apps communicate with each other has centered on a handful of patterns that everyone (who wants to stay relevant) uses.  ArcGIS Data Interoperability walks and talks in this space.

A clear winner is called REST, and since I'm not a computer scientist I won't go into what you can read for yourself.  All I care about is that pretty much everything on the web can send and receive data in a way I can easily use with ArcGIS Data Interoperability - mostly JSON, with some holdouts still using XML and a few stretching usability with protocol buffer payloads, but I'll keep an eye on those for you; I think that will pass 😉.

I need to add a graphic before the search engines get bored with me and I don't get indexed.  Here is a workspace (StubOutAPIJSONReading.fmw in the post download) I used to initially explore an API (details to follow):

Exploring an API

My sample API is published by the good folks at Clarity Movement.

Clarity was founded in 2014 to tackle the global air pollution crisis and now provides cost-effective, scalable, and reliable air quality monitoring to customers in more than 60 countries around the world. Clarity's solution enables governments and communities to collect higher-resolution air quality data by supplementing existing regulatory monitors with dense networks of continuously calibrated air quality sensors.

I like Clarity's Measurements endpoint as a good example of the most common pattern you will encounter, namely handling JSON data returned from an HTTP call.  Before you panic about things like transport protocols and JSON parsing, relax!  Data Interoperability handles it all for you, which is just as well because otherwise you would have to read stuff like this:

{
	"_id": "6137a995cd42fd51fcda7083",
	"device": "609935dcc9348052e0c5d917",
	"deviceCode": "AY989QV6",
	"time": "2021-09-07T17:00:00.000Z",
	"location": {
		"coordinates": [
			-120.90439519586954,
			36.01958888490562
		],
		"type": "Point"
	},
	"recId": "averaged:AY989QV6:hour:2021-09-07T18:00:00",
	"characteristics": {
		"relHumid": {
			"value": 30.428397178649904,
			"weight": 4,
			"raw": 30.428397178649904
		},
		"temperature": {
			"value": 30.882064819335939,
			"weight": 4,
			"raw": 30.882064819335939
		},
		"pm2_5ConcNum": {
			"value": 24.072547912597658,
			"weight": 4,
			"raw": 24.072547912597658
		},
		"pm2_5ConcMass": {
			"value": 13.646130166709775,
			"weight": 4,
			"raw": 21.126213550567628,
			"calibratedValue": 13.646130166709775,
			"epaNowCast": 15.097605519225573
		},
		"pm1ConcNum": {
			"value": 22.670581817626954,
			"weight": 4,
			"raw": 22.670581817626954
		},
		"pm1ConcMass": {
			"value": 12.186892986297608,
			"weight": 4,
			"raw": 12.186892986297608
		},
		"pm10ConcNum": {
			"value": 24.33531427383423,
			"weight": 4,
			"raw": 24.33531427383423
		},
		"pm10ConcMass": {
			"value": 28.42475652694702,
			"weight": 4,
			"raw": 28.42475652694702
		},
		"no2Conc": {
			"value": -6.413380280137062,
			"weight": 4,
			"raw": -6.413380280137062
		},
		"pm2_5ConcMass_24HourRollingMean": {
			"value": 15.003665288163619,
			"weight": 88,
			"raw": 27.317325342785229,
			"calibratedValue": 15.003665288163619
		},
		"pm2_5ConcNum_24HourRollingMean": {
			"value": 26.923120065168903,
			"weight": 88,
			"raw": 26.923120065168903
		},
		"pm10ConcMass_24HourRollingMean": {
			"value": 37.67677915096283,
			"weight": 88,
			"raw": 37.67677915096283
		},
		"pm10ConcNum_24HourRollingMean": {
			"value": 27.273549459197306,
			"weight": 88,
			"raw": 27.273549459197306
		}
	},
	"average": "hour"
}

The job is to turn this data into a hosted feature service, and automate service maintenance.

A boon of having a GIS background is that you likely know some Python, and JSON payloads like the one above look just like Python dictionaries and lists.  You only need to learn a couple of tricks before you can tackle pretty much any REST API.
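If you want to convince yourself, here is a minimal sketch in plain Python (standard library only, using field names from the sample measurement above) showing that parsed JSON really is just dictionaries and lists:

import json

# A fragment of the measurement payload above, as a JSON string
payload = """{
    "deviceCode": "AY989QV6",
    "time": "2021-09-07T17:00:00.000Z",
    "location": {"type": "Point",
                 "coordinates": [-120.90439519586954, 36.01958888490562]}
}"""

measurement = json.loads(payload)                  # JSON object -> Python dict
lon, lat = measurement["location"]["coordinates"]  # JSON array  -> Python list
print(measurement["deviceCode"], lon, lat)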

Trick #1:  Using HTTPCaller

HTTPCaller

HTTPCaller lets you simply fill in a form to make a web call and receive a response (JSON in this case).  First you must read your API doc and determine the required parameters, plus any optional ones you want, and whether to use a GET or POST method.  GET is usually for shorter URLs; POST supports longer URLs and also uploads supplied in a body.  Another aspect is authentication.  Many APIs require an API key (like you see here) or a token.  Typically keys do not expire and tokens do, plus tokens require a generation step (which may be done separately via HTTP or via OAuth2 in a web connection you set up for your account).  My example requires a key.  My call requests a JSON array of 10 air quality measurements and returns it in an attribute named _response_body.
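For the curious, here is a minimal sketch of the equivalent GET request in Python; the endpoint URL, header name, and parameter names here are illustrative assumptions, so take the real ones from the Clarity API doc:

import requests

API_KEY = "YOUR-API-KEY"  # supplied by Clarity; keys typically do not expire
URL = "https://api.example.com/v1/measurements"  # hypothetical endpoint

# HTTPCaller does the same job: a GET with the key in a header
# and query parameters trimming the result set to 10 measurements
response = requests.get(URL, headers={"x-api-key": API_KEY}, params={"limit": 10})
response.raise_for_status()
measurements = response.json()  # the JSON array HTTPCaller stores in _response_body
print(len(measurements), "measurements received")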

Trick #2:  Unpacking JSON

I'm headed toward ingesting arbitrary numbers of air quality measurements, but first I must figure out how to extract data from the JSON.  I requested 10 air quality measurements, so the returned JSON will be an array.  Each measurement in the array will have the same schema used by the fields of my features - the above JSON is one measurement feature.  I need to figure out how to extract field values.  The simplest way to do this is to write one measurement's JSON to a file and use the data-aware capability of JSONExtractor to build queries without hand-coding them.

If I temporarily set the JSONExtractor to read from a JSON file, I get a handy-dandy picker to build my JSON queries.  Then I can copy-paste the JSONExtractor into a production tool and set it to read from the incoming feature's JSON attribute, where it will extract and expose all the available field values.  How easy is that!

Here is my JSONExtractor while I'm reading from a file and building my queries:

JSONExtractor
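The queries JSONExtractor builds are just paths into that structure.  As a point of comparison, a minimal sketch of the same extraction in Python (field names taken from the sample measurement) looks like this:

# Flatten one measurement into a flat record, mirroring the
# queries built interactively in JSONExtractor above
def flatten(m):
    lon, lat = m["location"]["coordinates"]
    chars = m["characteristics"]
    return {
        "deviceCode": m["deviceCode"],
        "time": m["time"],
        "lon": lon,
        "lat": lat,
        # .get() guards against a measurement missing a characteristic
        "pm2_5ConcMass": chars.get("pm2_5ConcMass", {}).get("value"),
        "relHumid": chars.get("relHumid", {}).get("value"),
    }

rows = [flatten(m) for m in measurements]  # 'measurements' from the sketch above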

Now it's all downhill to creating and maintaining my feature service.

Clarity2GDB.fmw uses the JSON exploration done above to write out a feature class in my Pro project's home geodatabase - some attributes are renamed, and of course the fields get their desired types set when writing.

Clarity2GDB
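For context, the write step is roughly equivalent to this minimal arcpy sketch; the geodatabase path and output field names are hypothetical:

import arcpy

GDB = r"C:\Projects\AirQuality\AirQuality.gdb"  # hypothetical home geodatabase

# Create the point feature class with renamed, properly typed fields
fc = arcpy.management.CreateFeatureclass(
    GDB, "ClarityMeasurements", "POINT",
    spatial_reference=arcpy.SpatialReference(4326))[0]
arcpy.management.AddField(fc, "DeviceCode", "TEXT")
arcpy.management.AddField(fc, "PM25", "DOUBLE")

with arcpy.da.InsertCursor(fc, ["SHAPE@XY", "DeviceCode", "PM25"]) as cursor:
    for r in rows:  # the flattened records from the earlier sketch
        cursor.insertRow(((r["lon"], r["lat"]), r["deviceCode"],
                          r["pm2_5ConcMass"]))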

Then, after creating my feature service from the geodatabase feature class, I can recycle the work to maintain the service with RefreshClarityService.fmw, which differs only at the writer step.  You'll notice only changed features are written to the target service.  The workspace has a parameter that sets the history interval for my features; features in the service older than that get aged out.

RefreshClarityService
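If you're wondering what the refresh amounts to outside of Data Interoperability, here is a minimal sketch of the same maintenance pattern with the ArcGIS API for Python; the credentials, item ID, and seven-day history interval are placeholders:

from datetime import datetime, timedelta, timezone
from arcgis.gis import GIS

gis = GIS("https://www.arcgis.com", "username", "password")  # placeholder sign-in
layer = gis.content.get("<feature service item id>").layers[0]

# Age out features older than the history interval
cutoff = datetime.now(timezone.utc) - timedelta(days=7)
layer.delete_features(where=f"time < TIMESTAMP '{cutoff:%Y-%m-%d %H:%M:%S}'")

# Append the fresh measurements (change detection is omitted here)
adds = [{"attributes": r,
         "geometry": {"x": r["lon"], "y": r["lat"],
                      "spatialReference": {"wkid": 4326}}} for r in rows]
layer.edit_features(adds=adds)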

This final tool can be scheduled or run on demand.

That's it!  I have conquered the complexities of web integration and have a feature service I can use to power my maps, apps and dashboards.

Feature Service

The blog download has the ETL tools I used, less a functional API key; please contact Clarity if you want to test and implement an integration.