
Ability to Add a Field to Big Data Store via GeoEvent Server or Data Store

06-27-2018 06:40 AM
Status: Closed
WilliamCraft
MVP Regular Contributor

When a service is modified in GeoEvent Server Manager so that a new field is added by way of the Field Calculator Processor, the new field is not automatically added to the data source when using the Spatiotemporal Big Data Store. Right now, the only way to get an Output to recognize the new field is to re-create the data source itself under the Site settings. This is problematic because the associated hosted map and feature services that use the original data source are deleted and re-created during that process. If those services are Portal items whose URLs are used downstream as layers in web maps, this becomes a large-scale problem because every one of those web maps must be updated to point at the new Portal item. Additionally, deleting and re-creating the data source is disruptive to users consuming the Output.

I thought that manually adding the field to the GeoEvent Definition behind the Output might solve this issue; however, doing so displays an error indicating that the field exists in the GeoEvent Definition but not in the data source, and that the configuration is therefore incompatible.

Please introduce the capability to either dynamically add the field to the data store when re-saving the GeoEvent Definition, or to add the field manually via the Data Store endpoint.

4 Comments
RJSunderman

Good news to report on this one William Craft ... I believe the ability to add new attribute fields to an existing spatiotemporal big data Data Source is an enhancement coming with the 10.6.1 release.

I'm going to tag Qingying Wu in this reply. She should be able to either comment here or write up a GeoEvent Server blog post showing how to do this. Stay tuned!

- RJ

HåkonDreyer

Did this ever happen? I can't find a how-to anywhere.

In a 10.6.1 test environment I've successfully added a new field to the data source using Pro; the schema of the associated feature service was updated instantly, and after an "Edit Map Service" the map service honored the new schema too.

The new field is recognized by GeoEvent, and I'm able to set up an "Update a feature in a Spatiotemporal Big Data Store" output connector against the data source.
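For reference, the same schema change can also be scripted instead of going through Pro. Below is a minimal sketch using the ArcGIS API for Python; the portal URL, item id and field definition are placeholders, and it assumes the layer's admin endpoint accepts addToDefinition for a spatiotemporal data source:

from arcgis.gis import GIS

# Connect to the Enterprise portal (URL and credentials are placeholders).
gis = GIS("https://portal.example.com/portal", "admin_user", "password")

# Hypothetical item id of the hosted feature layer backed by the
# spatiotemporal data source.
item = gis.content.get("0123456789abcdef0123456789abcdef")
layer = item.layers[0]

# Hypothetical field definition to append to the layer schema.
new_field = {
    "name": "meterstatus",
    "type": "esriFieldTypeString",
    "alias": "Meter Status",
    "length": 50,
    "nullable": True,
}

# add_to_definition posts the change to the layer's admin endpoint.
result = layer.manager.add_to_definition({"fields": [new_field]})
print(result)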

BUT - when writing to the BDS there seems to be an issue with some underlying schema. In the GeoEvent log we see:

com.esri.ges.transport.bds.SaveToBDSWorker: 
Unexpected Failures when writing to the Spatiotemporal Big Data Store for data source "AMIDeviceBDS": 
[Id: "733639", Error: "Map(type -> mapper_parsing_exception,
 reason -> failed to parse, caused_by -> Map(type -> illegal_argument_exception,
 reason -> illegal latitude value [90.0] for 102100 [-89.0,89.0]))"]

And in the Elasticsearch log files we see:

[2020-03-10T10:20:01,447][WARN ][c.e.a.b.a.esrigeohash ] illegal latitude value [90.0] for 102100 [-89.0,89.0]
[2020-03-10T10:20:01,447][DEBUG][o.e.a.b.TransportShardBulkAction] [HAARCBIGDS01T01.xxx.xx] [a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310]
   [0] failed to execute bulk item (index) BulkShardRequest [[a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310][0]] containing [index {[a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310 [AMISite][AXDDvmunDy-iZVInbl-9],
       source[{"receivedtime":1583832001205,"statusgridconnection":"operational",
               "---geo_hash---":[597242.2,6643372.5],
               "kilde":0,"---timestamp---":1583832001207,
               "xcoord":597242.2,"objectid":365058,
               "globalid":"{BD9802F5-7027-3AC6-A65D-62DB31C61AEB}",
               "meteringpointid":"707057500051554942","substation":"4821",
               "ycoord":6643372.5,"lon":10.739134,
               "geometry":[597242.2,6643372.5],
               "lat":59.916344}]}]
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
 at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException(DocumentParser.java:176) ~[elasticsearch-5.5.0.jar:5.5.0]
 at org.elasticsearch.index.mapper.DocumentParser.parseDocument(DocumentParser.java:69) ~[elasticsearch-5.5.0.jar:5.5.0]
 at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.....

This does not make much sense to me. Is there a workaround, or do we still have to re-create the Big Data Store?
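One observation that may explain the error: the "geometry" / "---geo_hash---" values in the failing document are projected coordinates (they look like UTM metres), while the esrigeohash indexing apparently expects geographic longitude/latitude within the -89 to 89 range quoted in the error. A quick sanity check with pyproj, assuming EPSG:25832 for our Norwegian data (an assumption; substitute whatever spatial reference the feed really uses):

from pyproj import Transformer

# Values taken from the failing document above.
xcoord, ycoord = 597242.2, 6643372.5

# Assumption: the incoming events carry ETRS89 / UTM zone 32N coordinates.
to_wgs84 = Transformer.from_crs("EPSG:25832", "EPSG:4326", always_xy=True)
lon, lat = to_wgs84.transform(xcoord, ycoord)
print(lon, lat)  # roughly 10.74, 59.92 - matching the lon/lat fields in the document

# Projected metres written straight into the geo hash field land far outside
# this range, which is what appears to trigger the "illegal latitude" rejection.
assert -89.0 <= lat <= 89.0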

bberry

I have the same question. I wasn't able to find any documentation on how to go about adding a field to a service in the big data store. I have only been able to achieve this by deleting and recreating the service. Is there a way to do this without needing to delete the service, since that leads to outages on our end? Thanks!

KoryKramer
Status changed to: Closed

Closing this in the ArcGIS Pro idea exchange as it is a duplicate of this idea in the Enterprise idea exchange: https://community.esri.com/t5/arcgis-enterprise-ideas/ability-to-add-a-field-to-big-data-store-via/i...