So I'm trying to take incoming JSON which includes a simple array of values (the array is dynamic in the sense that it will sometimes have multiple values and sometimes zero) and send it to a feature layer field called "sources":
Sample Input
"sources": [
"water",
"Reactor"
],
"location": {
"latitude": 38.8815081,
"longitude": -77.111314,
"altitude": 81.20000457763672
}
Sample Output
sources        | latitude   | longitude  | altitude
water, reactor | 38.8815081 | -77.111314 | 81.20000457763672
I'm only concerned with how to handle the first field ("sources"), as the number of values can vary for each GeoEvent/row inserted.
Any ideas on the best way to handle this would be appreciated.
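For reference, the transformation being asked about amounts to a delimiter join over the variable-length array. A minimal Python illustration, assuming the input shape above:

sources = ["water", "Reactor"]   # may also be empty
joined = ", ".join(sources)      # -> "water, Reactor"; an empty array yields ""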
In the feed, first flatten the array, and then in the analytic use a Field Mapper to change the input location_0_latitude to the output latitude, and so on.
Hi @JeffSilberberg, my issue is actually with the "sources" values. If that array can vary in the number of values, how would I iterate through it and have the values added to the output field when it isn't known ahead of time how many items will appear in the sources array?
I'm trying to determine the best way of looping through the array so that all values can be added.
Michael,
That's the issue. Neither GeoEvent nor Velocity really supports a for, do, or while loop today.
In one case I knew the maximum number of occurrences in the array, and I just made sure my sample had the necessary data so that when I derived the schema I got all of the location_n_ occurrences, then flattened the data in a large Field Mapper. I think I had a maximum of five: occurrence 0 was always present, and I tested occurrences 1 through 4 for content.
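The equivalent of that fixed-upper-bound Field Mapper logic, sketched in Python purely for illustration and applied to the "sources" case from this thread (the sources_0 through sources_4 field names are an assumption about how a derived schema would flatten the array):

def join_fixed_slots(event, max_slots=5):
    # Occurrence 0 is always present; slots 1 through max_slots - 1 are tested for content.
    values = []
    for i in range(max_slots):
        value = event.get(f"sources_{i}")
        if value:                      # skip missing or empty slots
            values.append(value)
    return ", ".join(values)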
In another case the array could contain anywhere from one to 30 occurrences, and the only logical way I could find to handle this data was to refactor it in a preprocessor (an AWS EC2 instance) and then post the new flattened JSON transactions to the feed.
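A minimal sketch of that kind of preprocessor, assuming Python on the EC2 host and a placeholder feed endpoint (FEED_URL below is hypothetical, not a real Velocity URL):

import json
import urllib.request

FEED_URL = "https://example.com/feed"  # placeholder; substitute the actual feed endpoint

def preprocess_and_post(raw):
    # Reshape the incoming event: join the variable-length array into one field.
    event = json.loads(raw)
    flattened = {
        "sources": ", ".join(event.get("sources", [])),
        "latitude": event["location"]["latitude"],
        "longitude": event["location"]["longitude"],
        "altitude": event["location"]["altitude"],
    }
    # Post the flattened JSON transaction to the feed.
    req = urllib.request.Request(
        FEED_URL,
        data=json.dumps(flattened).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status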