Record only first record for track

12-06-2023 09:01 AM
KevinPiraino2
Regular Contributor

Our organization is currently utilizing Velocity to deploy the Winter Weather Operations solution. One of our Transportation manager's goals for using this solution is to more accurately track material usage across an event and across a season. While reviewing our AVL data returns, we found that all of the material values returned by the AVL system are summed values from when the device was first field deployed (day 1), making these raw material values not useful in their current form.

I am currently attempting to employ a workflow that captures the first record for a track when an event starts and stores this information in a separate stream or feature layer. This "First Record" stream or feature layer would then be used as the "zero" or starting value for each track, from which to calculate the running total of material use across an event.

I have so far attempted two separate workflows to capture this "First Record" data. Both are shown in the attached screenshots.

  1. Use the "Detect Incidents" tool in a real time analytic (RTA) to determine when a vehicle has entered/exited a pre-defined static "Operational Boundary" polygon (see screenshot for configuration of "Detect Incidents" tool). Then filter the records based on the returned "Incident Status" value using a where clause of "Incident Status == 'Started'". Store filtered records in a separate stream or feature layer.
  2. Use the "TrackFieldWindow" Arcade function to find the first record in an RTA. Use the calculate field tool to build a new field that uses the "TrackFieldWindow" Arcade function to return a boolean (true/false) value (see screenshot for this Arcade expression). Filter subsequent records based on this boolean field, only returning the records that contain a True value for this field. 
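Conceptually, workflow 2 reduces to keeping only the first observation seen for each track ID. A minimal Python sketch of that logic (outside Velocity; the field names `track_id` and `material_used` are hypothetical placeholders, not the actual AVL schema):

```python
# Keep only the first record seen for each track ID; that record
# becomes the "zero" baseline for the track.
def first_records(events):
    """Yield only the first event per track, in arrival order."""
    seen = set()
    for event in events:
        tid = event["track_id"]
        if tid not in seen:
            seen.add(tid)
            yield event  # baseline ("First Record") for this track

stream = [
    {"track_id": "plow-1", "material_used": 500},
    {"track_id": "plow-2", "material_used": 1200},
    {"track_id": "plow-1", "material_used": 650},  # filtered out
]
baselines = list(first_records(stream))
```

In Velocity the equivalent decision is made per-feature by the Arcade expression returning true/false, with a filter keeping only the true records.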

So far, workflow 1 seems the most promising and works to some degree when tested during non-events, but during an actual event with many more active vehicles I am finding that while values are being calculated, they are not always accurate over long durations. The "First Record" is being reset after a certain time frame. I am not sure if this is an issue with how I have the output stream or feature layer configured, or if the stateful configuration cannot handle that many records over that duration of time. I have had less time to test workflow 2 and am less confident in its ability to capture the "First Record", but wanted to try something different in case workflow 1 didn't pan out.

I am looking for any advice or suggestions on how to capture the "First Record" values, as well as any suggestions on how best to calculate the running material totals for an event/season using the "First Record" values as the zero value.

6 Replies
JeffSilberberg
Frequent Contributor

I would use option #1 and set the Target Time Window much longer, maybe a max allowed shift plus an hour. Then maybe add an ignition-off as a backup end condition.

Unfortunately, you get an End when the timer expires, not an Expired, and the next transaction either way will be another Started, so you are getting a Started every ten minutes here. Then, with the longer timer, maybe run a Big Data Analytic once every hour (shift) that joins the Started layer to the detail layer and calculates the usage to that point into a new data element in a new feature layer.
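The hourly join-and-subtract step described above can be sketched in plain Python (field names are illustrative placeholders, not Velocity tool parameters):

```python
# Join each detail record to its track's "Started" baseline and compute
# usage so far. Since AVL cumulative counters only grow, the largest
# delta seen is the usage to date for that track.
def usage_to_date(started, details):
    """Return per-track usage: latest cumulative value minus baseline."""
    baseline = {r["track_id"]: r["material_used"] for r in started}
    usage = {}
    for d in details:
        tid = d["track_id"]
        if tid in baseline:
            delta = d["material_used"] - baseline[tid]
            usage[tid] = max(usage.get(tid, 0), delta)
    return usage

started = [{"track_id": "plow-1", "material_used": 500}]
details = [
    {"track_id": "plow-1", "material_used": 650},
    {"track_id": "plow-1", "material_used": 900},
]
print(usage_to_date(started, details))  # {'plow-1': 400}
```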

KevinPiraino2
Regular Contributor

Jeff, 

Thanks for the quick response. I forgot to mention in my original post that the "Target Time Window" of 10 minutes is actually an adjustment I made after the first snow event / live test. The original "Target Time Window" was set at 7 days so that the auto close/end condition wouldn't be met until after the event ended. Even with the much longer "Target Time Window", I found that values were being reset after what seemed to be a few hours. I also suspect that a long "Target Time Window" may affect the performance of the RTA, with the stateful configuration keeping too many features in memory, although this is only an assumption.

According to the Esri documentation (https://doc.arcgis.com/en/iot/analyze/perform-real-time-analysis.htm), the "Target Time Window" does affect how many features are stored in memory and used in the comparison process, but I couldn't find any information about when features are purged or whether they get set back to their original state (in the case of Detect Incidents, whether the first feature following a purge would have a status of "Started").

[attached screenshot: KevinPiraino2_0-1701883526889.png]

JeffSilberberg
Frequent Contributor

7 days seems a little on the other extreme. But if you are not comfortable with what you see versus what you are expecting, maybe open a support case. I know there were issues around the timer and the state set, but I thought those were resolved in one of the four dot releases.

"I found that values were being reset after what seemed to be a few hours." What values, and are you sure these were not being reset based on your data stream?

https://doc.arcgis.com/en/iot/analyze/detect-incidents.htm

KevinPiraino2
Regular Contributor

Yes, I would agree that 7 days may be a little extreme. I initially set it to 7 days as more of a fail-safe, to always capture the starting record regardless of how long the event lasts. I may end up changing it back to maybe 1-2 days to see if that changes anything.

Regarding the reset of values, I agree that it may not be the "Detect Incidents" tool that's causing the starting values to reset. It may be how I have the subsequent Winter Fleet Tracking RTA set up to ingest the "First Record" stream or feature layer, as this is where the summation calculation is completed and where I noticed the "resetting" of the zero value. It was unclear to me what type of input I should use in the Winter Fleet Tracking RTA for the "First Record" data (stream layer or feature layer). Do you have any suggestions?

My main goal in this post was to find out whether any other users were having similar issues with the data streams from their AVL systems not being "zeroed" out at the beginning of each "event", and whether anyone had set up similar workflows to record the "First Record".

JeffSilberberg
Frequent Contributor

Kevin, did you find any answers?  If you want to do a call and look at this next week, DM me your contact information. 

KevinPiraino2
Regular Contributor

In case anyone else is having this same issue, I believe I found a solution by using the "Control Event Volume" tool in a separate Real Time Analytic and ingesting the output of that RTA into the main RTA. See below for the generic workflow I have employed for my own project:

  1. Real Time Analytic (RTA) "Get First Value":
    1. Input --> Live Feed (AVL data from snow plows).
    2. Control Event Volume --> interval set to as long as the RTA is running, with "Max Events Per Interval" equal to 1
    3. Output --> Feature Layer "First Value"
  2. Real Time Analytic (RTA) "Ingest First Value":
    1. Input --> Live feed (AVL Data from snow plows) & Feature Layer "First Value" 
    2. Join Live feed to "First Value" based on TrackID 
      1. Summarize joined "First Value" fields by "MAX" value
    3. Calculate new running total fields using "Material Value" - "First Value" = "Material Total"
    4. Output --> Feature Layer "Tracks" 
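The two-RTA workflow above can be sketched end to end in plain Python. Here "Control Event Volume" with "Max Events Per Interval" = 1 over the whole run is modeled as keeping the first record per track; the join/summarize step then subtracts that baseline from each incoming value. Field names are illustrative, not the actual layer schema:

```python
def capture_first_values(feed):
    """RTA 1 sketch: keep one value per track for the analytic's lifetime."""
    first = {}
    for rec in feed:
        # setdefault keeps only the first value seen per track_id
        first.setdefault(rec["track_id"], rec["material_value"])
    return first

def running_totals(feed, first_values):
    """RTA 2 sketch: join on track_id, compute material_value - first_value."""
    totals = []
    for rec in feed:
        baseline = first_values.get(rec["track_id"], 0)
        totals.append({
            "track_id": rec["track_id"],
            "material_total": rec["material_value"] - baseline,
        })
    return totals

feed = [
    {"track_id": "plow-1", "material_value": 500},
    {"track_id": "plow-1", "material_value": 725},
]
first_values = capture_first_values(feed)
print(running_totals(feed, first_values)[-1])
# {'track_id': 'plow-1', 'material_total': 225}
```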
