
XML: can I use this source?

07-08-2015 12:22 PM
WilliamMeyers1
Deactivated User

http://brgov.com/reports/public/brtrxml.xml

This is the active traffic incidents from our dispatch system.  Any help would be great.

RJSunderman
Esri Regular Contributor

Hello William –

I had no problems bringing in your traffic incidents feed from the City of Baton Rouge. I had to remove the default 'application/xml' specification from my configured GeoEvent input's 'Acceptable MIME Types' parameter. I also had to specify trafficincident as the 'XML Object Name' so that the input would know how to extract individual events from the feed's structure.
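If it helps to see what that 'XML Object Name' setting is doing outside of GeoEvent, here is a rough Python sketch. The element names trafficincidentlist and trafficincident and the list-level date attribute come from the discussion in this thread; the individual incident fields are assumptions, and your actual feed schema may differ.

```python
# Rough sketch (not GeoEvent itself): setting 'XML Object Name' to
# trafficincident splits the feed into one record per <trafficincident>
# element under the <trafficincidentlist> root. Assumes no XML namespaces;
# the incident field names themselves are illustrative only.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "http://brgov.com/reports/public/brtrxml.xml"

with urllib.request.urlopen(FEED_URL) as response:
    root = ET.fromstring(response.read())  # the <trafficincidentlist> element

# The list-level date is carried on the root element, not on each incident.
print("list-level attributes:", root.attrib)

for incident in root.iter("trafficincident"):
    # Each <trafficincident> becomes one event; fields may arrive as XML
    # attributes or child elements depending on the feed's schema.
    record = dict(incident.attrib)
    record.update({child.tag: child.text for child in incident})
    print(record)
```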

I cannot be sure from the feed whether using the location_number attribute as a TRACK_ID is appropriate, but identifying a field within the feed's data which can be used to uniquely identify each event will be fairly important. The input receives all available event records every polling cycle, and if we elect to broadcast the event data out through a stream service, or update features in a feature service, we'll need a TRACK_ID in order to visualize the data properly on a map.
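To make the TRACK_ID point concrete, here is a hypothetical sketch of why a unique identifier matters when the same records arrive on every polling cycle. Treat location_number as a candidate key only, as noted above.

```python
# Hypothetical sketch: updates are keyed by TRACK_ID, so repeated polls of the
# same feed should overwrite existing tracks rather than pile up duplicates.
# 'location_number' is only a candidate key here, as discussed above.
def apply_poll(track_cache, records, track_id_field="location_number"):
    """Merge one polling cycle's records into a cache keyed by TRACK_ID."""
    for record in records:
        track_id = record.get(track_id_field)
        if track_id is None:
            continue  # a record without a usable TRACK_ID cannot be tracked
        track_cache[track_id] = record  # update in place, no duplicate features
    return track_cache

cache = {}
apply_poll(cache, [{"location_number": "17", "description": "stalled vehicle"}])
apply_poll(cache, [{"location_number": "17", "description": "cleared"}])
print(len(cache))  # 1 -- the second poll updated, rather than duplicated, the track
```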

Because the feed’s structure includes the date as an attribute of the trafficincidentlist structure, rather than as part of each trafficincident item, we have a small challenge to overcome. More on that in a moment.

Attached are illustrations of my configuration of the 'Poll an External Website for XML' input and the GeoEvent Definition I configured my input to use. I allowed the input to create an initial GeoEvent Definition for me, then copied the generated definition and configured my input to use the copy. Working from a copy rather than the auto-generated definition is a simple best practice.

Hope this information helps –

RJ

RJSunderman
Esri Regular Contributor

The http://brgov.com/reports/public/brtrxml.xml feed provides the current date as an attribute of the trafficincidentlist structure. Since each inbound traffic record only specifies a local time, it is difficult for GeoEvent to determine a proper epoch timestamp in milliseconds to associate with each event.

We can work with this. By polling the same feed in a different way, we can cache the header information in a feature service's feature class, and then use that cache to enrich each of the individual events we receive from the feed.

Consider the attached GeoEvent Definition illustration.

This event definition specifies that only the trafficincidentlist.time and trafficincidentlist.date attributes should be pulled from the feed. Further, the trafficincidentlist.time is to be handled as a Date, not as a String. If we configure a new 'Poll an External Website for XML' input, we can poll the same feed a second time. This time, however, I leave the 'XML Object Name' property unspecified, so this new input retrieves the list-level base date that applies to every traffic incident rather than the individual incident records. I also need to specify an 'Expected Date Format' so that the input knows not to expect a time value as part of the received date string. (Refer to the attached illustration of the 'Poll an External Website for XML' input.)
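As a plain-Python illustration of the 'Expected Date Format' idea: the list-level date string can be parsed on its own, with no time component expected. The MM/dd/yy pattern mentioned later in this post corresponds to %m/%d/%y below; the sample date value is only an example.

```python
# Sketch of the 'Expected Date Format' behavior for the second (header-only)
# input: parse the list-level date string by itself, with no time component.
# GeoEvent's MM/dd/yy pattern corresponds to Python's %m/%d/%y.
from datetime import datetime

list_level_date = "07/08/15"  # example value of trafficincidentlist.date

base_date = datetime.strptime(list_level_date, "%m/%d/%y")
print(base_date)  # 2015-07-08 00:00:00 -- midnight, since no time was supplied
```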

I can then incorporate this second input into my GeoEvent Service, using a Field Mapper to map the retrieved Date value to a schema consistent with a feature class in my geodatabase. I use a Field Calculator to hard-code the URL being polled for data as a TRACK_ID for these "features", and an 'Update a Feature' output to write the base date, in epoch milliseconds, into the feature class. This keeps our cached epochbase attribute current as a feature in a feature service. (Refer to the illustration detailing the use of the Field Mapper and Field Calculator with the fs-out output updating the feature service.)
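The epochbase value being cached is just that parsed base date expressed in epoch milliseconds, stored against a constant TRACK_ID (the polled URL). A rough sketch, with the feature-service update reduced to a dictionary:

```python
# Sketch of what the fs-out output caches: the base date as epoch milliseconds,
# keyed by a hard-coded TRACK_ID (here, the URL that was polled).
# The real output writes this to a feature class; a dict stands in for it here.
from datetime import datetime, timezone

FEED_URL = "http://brgov.com/reports/public/brtrxml.xml"

# Parsed list-level date, treated as UTC midnight for simplicity.
# (The local-time vs GMT question is discussed further down in this post.)
base_date = datetime(2015, 7, 8, tzinfo=timezone.utc)
epochbase_ms = int(base_date.timestamp() * 1000)

feature_cache = {FEED_URL: {"epochbase": epochbase_ms}}
print(feature_cache)  # {'http://brgov.com/...': {'epochbase': 1436313600000}}
```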

Now we can retrieve the epochbase attribute from the feature service and use it to enrich each incoming traffic event from the city’s feed.

To do this, we add additional fields (feedidentifier, hours, and minutes) to our BatonRouge-TrafficIncidents event definition. These fields are not provided by the feed, but adding them to the event definition gives us attributes into which we can write data. We also add a Field Enricher to enrich the incoming traffic events with the cached Date from the feature service, and a Field Calculator to extract the hours and minutes as substrings from each event's time attribute.
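The hours/minutes extraction is just string slicing of each incident's time attribute. A sketch, assuming an HH:mm style time string (the actual format in the feed may differ):

```python
# Sketch of the Field Calculator step that pulls hours and minutes out of the
# incident's time attribute as substrings. An HH:mm time string is assumed
# here; the actual format in the feed may differ.
def split_time(time_string):
    hours, minutes = time_string.split(":")[:2]
    return int(hours), int(minutes)

hours, minutes = split_time("12:22")
print(hours, minutes)  # 12 22
```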

An additional Field Calculator can then be used to add the hours and minutes to the epochbase – creating a fully-qualified date/time value for each event. (Refer to illustration ‘Final GeoEvent Service’).
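In other words, the final field calculation is simple arithmetic on milliseconds, something like the sketch below (epochbase comes from the enricher; hours and minutes come from the previous step):

```python
# Sketch of the final Field Calculator arithmetic: combine the cached base date
# with each event's own hours and minutes to get a full timestamp in epoch ms.
MS_PER_HOUR = 60 * 60 * 1000
MS_PER_MINUTE = 60 * 1000

def event_timestamp_ms(epochbase_ms, hours, minutes):
    return epochbase_ms + hours * MS_PER_HOUR + minutes * MS_PER_MINUTE

# epochbase for 2015-07-08 00:00 UTC plus a 12:22 local incident time
print(event_timestamp_ms(1436313600000, 12, 22))  # 1436358120000
```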

Admittedly, this is a lot to go through to pull information from an XML feed's header and incorporate it into individual event records extracted from a feed's list. But it illustrates several things you can do with a combination of processors to manipulate data obtained from a feed. Also, note that the GeoEvent input, configured to expect only MM/dd/yy for the Date, appears to assume that the date is local. When the Date is written to a feature service, however, clients retrieving the value will assume it is expressed in epoch milliseconds GMT, so the manufactured date displayed in a web map will probably be artificially offset from GMT to your server's local time. To complete the solution, we really should correct the locally reported Baton Rouge time by adding +5 hours, pushing the value forward to represent it as a GMT / UTC value when caching it in the feature service.
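If you do add that correction, it is again just milliseconds, for example:

```python
# Sketch of the suggested correction: shift the locally reported Baton Rouge
# time forward to GMT/UTC before caching it. +5 hours is the offset cited
# above; the exact local offset will depend on daylight saving time.
MS_PER_HOUR = 60 * 60 * 1000
UTC_OFFSET_HOURS = 5

def to_utc_ms(local_epoch_ms, offset_hours=UTC_OFFSET_HOURS):
    return local_epoch_ms + offset_hours * MS_PER_HOUR

print(to_utc_ms(1436358120000))  # 1436376120000
```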

Hope this information is helpful –

RJ