Handling geoevent stream service with null values

07-22-2021 06:47 AM
by kavi88, Occasional Contributor

Hey guys,

I am using the Poll an External Website for JSON input connector to stream data. I have a multicardinal JSON structure (attached below), so I am using the Multicardinal Field Splitter processor to flatten the structure and be able to access the coordinates. The coordinates (lon, lat) are typed as String and need to be converted to Double, after which I construct a geometry using the expression ['{'+' "x":' + X + ',' + ' "y":' + Y + ',' + ' "spatialReference":{"wkid":4326} }'].

I think the Multicardinal Field Splitter processor is doing a great job of flattening the structure the way I want it to, yet I am getting this error:

Expression ['{'+' "x":' + X + ',' + ' "y":' + Y + ',' + ' "spatialReference":{"wkid":4326} }'] evaluation failed: operator+(arg0:[NonGroup], arg1:[NonGroup]) evaluation failed because of GeoEvent field 'X' of type [Double] and value 'null' cannot be used as an argument arg1:[NonGroup]

I am wondering if I am doing this the right way, or what am I missing? @RJSunderman

5 Replies
EricIronside
Esri Regular Contributor

Hey @kavi88 

Everything looks in order, but just to be sure, here is the equation I would use:

'{ "x":' + X + ', "y":' + Y + ', "spatialReference" : { "wkid" : 4326 } }'

An alternative would be to convert the X and Y to strings, but I don't think this is necessary:

'{"x":' + valueOf(X) + ',"y":' + valueOf(Y) + ',"spatialReference" : { "wkid" : 4326 } }'

If that doesn't work, then there is something in your schema that isn't what you expect. So I would send the events to a JSON File output and inspect the structure there to see if there is any remaining grouping.
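As a cross-check, here is a small Python sketch (illustrative only; the Field Calculator expression language is not Python) of what the concatenation above effectively builds, with a guard for missing coordinates:

```python
import json

def geometry_string(x, y):
    # Build the same string the Field Calculator expression concatenates.
    # Return None when either coordinate is missing, mirroring the
    # evaluation failure described in this thread.
    if x is None or y is None:
        return None
    return ('{ "x": ' + str(x) + ', "y": ' + str(y) +
            ', "spatialReference" : { "wkid" : 4326 } }')

s = geometry_string(6.14244, 46.21022)
```

Parsing the result with json.loads is a quick way to confirm the string is well-formed before asking GeoEvent to cast it to a Geometry.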

RJSunderman
Esri Regular Contributor

Hey @kavi88 --

When using a Field Calculator to construct a "Geometry" you are actually calculating a String representation of a Geometry. When I need to confirm the string calculation I will often configure a Field Calculator to write its string representation to a String attribute field and then map the String to a Geometry attribute field. You can configure a Field Calculator to write its string representation directly into a Geometry attribute field, but the single step means that you are asking for an implicit type cast from String -- the value calculated as a single-quoted literal -- to a Geometry. If the string value does not exactly match the required formatting for a Point geometry object, the Field Calculator's attempt to write its string into a Geometry field will fail.

So, to Eric's point, you might want to route event records emitted from the GEOM_CONSTRUCTION Field Calculator you configured to a JSON File so that you can get a good look at the String the processor constructed for you, to make sure it matches the formatting of a Point geometry object.

You can probably drop the two Field Calculator processors LatConverter and LonConverter from the event processing workflow. You can configure the MAPPING FIELDS Field Mapper to map your latitude and longitude attribute values from String to Double by simply mapping the attribute values into Double fields. This is just another implicit cast, like when using Field Calculator to compute a string representation of a geometry, but writing the computed string into a Geometry field.

If I had to guess, the problem you're having is probably in the serialized event schema flattening. Placing five Multicardinal Field Splitter processors in series is more than I've ever had to do to simplify a hierarchical data structure. It's either that, or the string representation of the Point geometry object being calculated doesn't match the ArcGIS REST API specification of a Point geometry.
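On that last point, a Point geometry in ArcGIS REST JSON is an object with numeric x and y plus a spatialReference. A quick, illustrative Python check (not part of GeoEvent) of a calculated string might look like:

```python
import json

def looks_like_point(s):
    # True only when the string parses as JSON and carries the keys a
    # Point geometry object requires: numeric x/y and a spatialReference.
    try:
        obj = json.loads(s)
    except (ValueError, TypeError):
        return False
    return (isinstance(obj.get("x"), (int, float))
            and isinstance(obj.get("y"), (int, float))
            and "spatialReference" in obj)
```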

As a debugging step, you might try using dot notation to pull a single pair of latitude and longitude values out of the hierarchical data structure, using a Field Mapper to map the entirety of the data structure down to an event record whose GeoEvent Definition has exactly two Double attributes (one named lat and one named lon). Then work with that very simple event record to debug the field calculation you need to perform to construct a JSON representation of a Point geometry object.

  • disruptions[0].impacted_objects[0].impacted_stops[0].stop_point.coord.lat  =>  lat
  • disruptions[0].impacted_objects[0].impacted_stops[0].stop_point.coord.lon  =>  lon

I wrote the above without actual data to look at and test, so I am not 100% sure I have the notation correct. If you need help with this I would ask that you open a technical support incident with Esri Support.

What I'm trying to do above is take the zero-th value from each group element whose cardinality is 'Many' (indicating the JSON element is a zero-based indexed list of values) to pull a single "stop point" coordinate's latitude and longitude out so that the values can be used in a Field Calculator. You'll still need to use the Multicardinal Field Splitters eventually so that you run calculations on all of the stop points, but the above can help you debug to make sure the string calculation of the Point geometry object is being done correctly.
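In ordinary code, the zero-th-element extraction described above would look something like this Python sketch (key names are taken from the thread's data structure; purely illustrative):

```python
def first_stop_coord(payload):
    # Walk the hierarchy, taking index 0 of each 'Many'-cardinality list,
    # to reach a single stop point's coordinates.
    coord = (payload["disruptions"][0]
                    ["impacted_objects"][0]
                    ["impacted_stops"][0]
                    ["stop_point"]["coord"])
    # lat/lon arrive as strings and are cast to float (Double).
    return float(coord["lat"]), float(coord["lon"])
```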

Hope this helps --
RJ

cross-reference:  JSON Data Structures - Working with Hierarchy and Multicardinality 

EricIronside
Esri Regular Contributor

It occurred to me after reading RJ's response that the cause of the issue might be that the X or Y values are not being pulled out as expected. Taking another look at the error "...evaluation failed because of GeoEvent field 'X' of type [Double] and value 'null' cannot be used as an argument arg1:[NonGroup]..." I believe it is trying to tell you that it was looking for a Double value from the X field, but got a null value instead. Writing your events out to a JSON File or using the Sampler will help you figure out what the real value of the X field is. And RJ's point about extracting the information from the group elements using 'dot' notation is a good one to try as well.

by kavi88, Occasional Contributor

Hey @RJSunderman, @EricIronside,

Thank you for your insights and the explanation.

I routed the event records emitted from the GEOM_CONSTRUCTION Field Calculator I configured to a JSON File. I used dot notation to pull a single pair of latitude and longitude values out of the hierarchical data structure, and a Field Mapper to map the entirety of the data structure down to an event record whose GeoEvent Definition has exactly two Double attributes (lon, lat). I then calculated the geometry using the expression suggested by @EricIronside, which resulted in this kind of output:

{
  "Train_id": "ad37a93f-63f9-4bb4-980b-405b2996299e",
  "lon": 6.14244,
  "lat": 46.21022,
  "geom": {
    "x": 6.14244,
    "y": 46.21022,
    "spatialReference": {
      "wkid": 4326
    }
  }
}

And also this kind of output:

{
  "Train_id": "ad37a93f-63f9-4bb4-980b-405b2996299e",
  "lon": null,
  "lat": null,
  "geom": null
}

Checking the GeoEvent Server logs, I found this error:

Expression ['{ "x":' + lon + ', "y":' + lat + ', "spatialReference" : { "wkid" : 4326 } }'] evaluation failed: operator+(arg0:[NonGroup], arg1:[NonGroup]) evaluation failed because of GeoEvent field 'lon' of type [Double] and value 'null' cannot be used as an argument arg1:[NonGroup].

This got me thinking that GeoEvent Server is not capable of handling null values. Please correct me if I am wrong.

Thank you.

RJSunderman
Esri Regular Contributor

Hello @kavi88 ... I would say that GeoEvent Server is able to handle null value input. Attribute values can be null and there should not be a runtime exception generated that creates a fault in event record processing. That doesn't mean that you'll be able to calculate a derivative value if the input values are null or if attribute values cannot be used in the expression you configure a Field Calculator to use.

Suppose you receive some simple event record like:
{ "myDouble": 3.14159, "scaleFactor": 3.1, "calcResult": null }

Field Calculator configured with an expression myDouble * scaleFactor will be able to write the value 9.738929 into an existing field calcResult.

But if one or more of the attribute fields contain null values:
{ "myDouble": 3.14159, "scaleFactor": null, "calcResult": null }

You should expect to see some sort of error. You cannot multiply a Double and a null value, or implicitly cast a null or a literal string to a numeric value to allow a Field Calculator to compute a value. We try not to make up data in cases where invalid values are received. We wouldn't want, for example, to assume a location of 0.0 latitude / 0.0 longitude just because the lat and lon values pulled out of a data structure were null.
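The same behavior is easy to sketch outside GeoEvent. In this illustrative Python version of the myDouble * scaleFactor calculation, a null input propagates rather than being replaced with made-up data:

```python
def calc_result(my_double, scale_factor):
    # Refuse to fabricate a value when either input is null; return None
    # so the downstream field stays null, as described above.
    if my_double is None or scale_factor is None:
        return None
    return my_double * scale_factor
```

Here calc_result(3.14159, 3.1) gives approximately 9.738929, while calc_result(3.14159, None) gives None instead of raising an error.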

Suppose, rather than computing a Double value we were simply trying to place two Double values into a descriptive string. An expression like the following:
'My Double is: ' + myDouble + ' and my Scale Factor is: ' + scaleFactor + '.'

Written into a String attribute would calculate a value something like:
"My Double is: 3.14159 and my Scale Factor is: 3.1."

If a null value were received for the scaleFactor an error message like the following is logged:
Expression ['My Double is: ' + myDouble + ' and my Scale Factor is: ' + scaleFactor + '.'] evaluation failed: EVALUABLE_EVALUATION_FAILED_CAUSE

The error message above is what is produced at the 10.9.x release. It may be that Field Calculator logs less readable error messages at an earlier release, which would explain why you are seeing messages talking about arg0:[NonGroup], arg1:[NonGroup]. I know we improved the error messages that Field Calculator was logging at some point, but I don't remember which software release has those changes. Regardless, if an expression uses attribute field(s) whose value(s) are null, you should probably expect to see some sort of error logged and the computed result to receive a null value.

The problem you are trying to solve has several different places where something can go wrong. I have frequently encountered, for example, data in a rich, complex hierarchical structure that is not 100% homogeneous across all of the levels in the hierarchy. It could easily be the case, for example, that the "impacted_objects" for a "disruption" do not have a "stop point" defined. It may be that there is no value at the hierarchical path disruptions[idx].impacted_objects[idx].impacted_stops[idx].stop_point.coord.lat, or that an attribute exists at that level in the data structure but its value is null.

I would assume that after you use the serialized multicardinal field splitter processors to flatten out all of the levels in the data structure, you'll have to use a couple of filters to test whether valid lat and lon values can be retrieved and log a "disruption" identifier to a file when a "stop_point" location cannot be calculated rather than trying to calculate a string representation of a geometry using null values.
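That filter step could be prototyped along these lines (illustrative Python; the record and id field names are assumptions, not GeoEvent API):

```python
def split_records(records):
    # Route records with usable coordinates onward; collect the
    # disruption ids of records whose stop_point location is null so
    # they can be logged instead of breaking the geometry calculation.
    usable, rejected_ids = [], []
    for rec in records:
        if rec.get("lat") is None or rec.get("lon") is None:
            rejected_ids.append(rec.get("disruption_id"))
        else:
            usable.append(rec)
    return usable, rejected_ids
```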

- RJ