Field Map an array of values to a single field

05-08-2023 02:13 PM
MichaelKarikari1
New Contributor III

So I'm trying to take incoming JSON which includes a simple array of values (the array is dynamic in the sense that it will sometimes have multiple values and sometimes have zero) and send it to a feature layer field called "sources":

Sample Input

"sources": [
        "water",
        "Reactor"
    ],
    "location": {
        "latitude": 38.8815081,
        "longitude": -77.111314,
        "altitude": 81.20000457763672
    }

Sample Output

sources          latitude     longitude    altitude
water, reactor   38.8815081   -77.111314   81.20000457763672
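Setting GeoEvent configuration aside for a moment, the shape of the transformation itself is simple: join the variable-length array into one delimited string and hoist the nested location keys. A minimal Python sketch (hypothetical preprocessing code, not a GeoEvent processor), using the sample input above:

```python
import json

# Parse the sample transaction shown above.
raw = json.loads("""
{
    "sources": ["water", "Reactor"],
    "location": {
        "latitude": 38.8815081,
        "longitude": -77.111314,
        "altitude": 81.20000457763672
    }
}
""")

row = {
    # An empty or missing "sources" array simply yields an empty string.
    "sources": ", ".join(raw.get("sources", [])),
    "latitude": raw["location"]["latitude"],
    "longitude": raw["location"]["longitude"],
    "altitude": raw["location"]["altitude"],
}
print(row["sources"])  # → water, Reactor
```

The `", ".join(...)` call is what makes the variable length a non-issue: it handles zero, one, or many values without any loop logic in the schema.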

I'm only concerned with how to handle the first field, as the number of values can vary for each GeoEvent/row inserted.

Any ideas on the best way to handle this would be appreciated.


3 Replies
JeffSilberberg
Occasional Contributor III

In the feed, first flatten the array; then, in the analytic, use a Field Mapper to change input location_0_latitude to output latitude, and so on.

MichaelKarikari1
New Contributor III

Hi @JeffSilberberg, my issue is actually with the "sources" values. If that array can vary in the number of values, how would I iterate through it and add the values to the output field when it isn't known ahead of time how many items will appear in the sources array?

I'm trying to determine the best way of looping through the array so all values can be added.

JeffSilberberg
Occasional Contributor III

Michael, 

That's the issue. Neither GeoEvent nor Velocity really supports a for, do, or while loop today.

In one case I knew the maximum number of occurrences in the array, so I just made sure my sample had the necessary data; when I derived the schema I got all the location_n_ occurrences, and then flattened the data in a large Field Mapper. I think the top was 5: occurrence 0 was always present, and for 1 through 4 I tested for content.

In another case the array could contain anywhere from one to 30 occurrences, and the only logical way I could find to handle the data was to refactor it in a preprocessor (an AWS EC2 instance) and then post the new, flattened JSON transactions to the feed.
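That preprocessor approach can be sketched in a few lines of Python. This is a hypothetical standalone step (the field names come from the sample earlier in the thread, and `flatten_transaction` is an illustrative name, not a GeoEvent API), shown without the HTTP post to the feed:

```python
import json

def flatten_transaction(payload: dict) -> str:
    """Collapse the variable-length 'sources' array into one delimited
    string and hoist the nested location keys, so the feed receives a
    flat record with a fixed schema regardless of array length."""
    flat = {
        "sources": ", ".join(payload.get("sources", [])),
        **payload.get("location", {}),
    }
    return json.dumps(flat)

incoming = {
    "sources": ["water", "Reactor"],
    "location": {"latitude": 38.8815081, "longitude": -77.111314,
                 "altitude": 81.20000457763672},
}
print(flatten_transaction(incoming))
```

Because the join happens before the feed ever sees the data, the derived schema stays the same whether the array has zero items or thirty, which sidesteps the missing loop support entirely.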
