
Ability to Add a Field to Big Data Store via GeoEvent Server or Data Store

06-27-2018 06:40 AM
Status: Closed
MVP Regular Contributor

When modifying a service in GeoEvent Server Manager so that a new field is added via the Field Calculator Processor, the new field is not automatically added to the data source when using the Spatiotemporal Big Data Store.  Right now, the only way to get an Output to recognize the new field is to re-create the data source itself under the Site settings.  This is problematic because the associated hosted map and feature services that use the original data source are deleted and re-created during that process.  If those services are Portal items whose URLs are used downstream as layers in web maps, this becomes a large-scale problem: every one of those web maps must be updated to point at the new Portal item.  Additionally, deleting and re-creating the data source is disruptive to users consuming the Output.

I was thinking that adding the field manually to the GeoEvent Definition behind the Output might solve this issue; however, doing this displays an error indicating that the field exists in the GeoEvent Definition but not in the data source, so the configuration is incompatible:

Please introduce the capability to either dynamically add the field to the data store when re-saving the GeoEvent Definition, or to manually add the field via the Data Store endpoint.
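For ordinary hosted feature layers, the ArcGIS REST API already exposes an addToDefinition admin operation that adds a field to an existing layer without deleting it; whether a layer backed by the spatiotemporal big data store honors it will depend on the release (see RJ's 10.6.1 comment below). As a sketch only, here is what the request payload for that operation looks like -- the URL, service name, and field name are hypothetical examples, not taken from this thread:

```python
import json

# Hypothetical admin endpoint for the hosted feature layer (layer 0).
# The real URL comes from your server's admin REST directory.
admin_url = ("https://example.com/server/rest/admin/services/"
             "Hosted/AMIDeviceBDS/FeatureServer/0/addToDefinition")

# addToDefinition takes a JSON layer-definition fragment; here we add
# a single new double field. Field name and alias are placeholders.
payload = {
    "addToDefinition": json.dumps({
        "fields": [{
            "name": "calc_result",            # hypothetical new field
            "type": "esriFieldTypeDouble",
            "alias": "Calc Result",
            "nullable": True,
        }]
    }),
    "f": "json",
}

# POST payload (with a valid token) to admin_url to apply the change.
new_field = json.loads(payload["addToDefinition"])["fields"][0]["name"]
print(new_field)  # calc_result
```

If this worked against a spatiotemporal data source, the GeoEvent Definition could then be re-saved to match the updated schema instead of re-creating the data source.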


Good news to report on this one William Craft‌ ... I believe the ability to add new attribute fields to an existing spatiotemporal big data Data Source is an enhancement coming with the 10.6.1 release.

I'm going to tag Qingying Wu‌ in this reply. She should be able to either comment here or write up a GeoEvent Server blog post showing you how to do this. Stay tuned!

- RJ


Did this ever happen? Can't find a howto anywhere.

In a 10.6.1 test environment I've successfully added a new field to the data source using Pro; the schema of the associated feature service was updated instantly, and after an "Edit Map Service" the map service honored the new schema too.

The new field is recognized by GeoEvent, and I'm able to set up an "Update a Feature in a Spatiotemporal Big Data Store" output connector to the data source.

BUT - when writing to the BDS it seems there is an issue with the underlying schema. In the GeoEvent log we see:

Unexpected Failures when writing to the Spatiotemporal Big Data Store for data source "AMIDeviceBDS": 
[Id: "733639", Error: "Map(type -> mapper_parsing_exception,
 reason -> failed to parse, caused_by -> Map(type -> illegal_argument_exception,
 reason -> illegal latitude value [90.0] for 102100 [-89.0,89.0]))"]

And in the elastic log files we see:

[2020-03-10T10:20:01,447][WARN ][c.e.a.b.a.esrigeohash ] illegal latitude value [90.0] for 102100 [-89.0,89.0]
[2020-03-10T10:20:01,447][DEBUG][o.e.a.b.TransportShardBulkAction] [] [a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310]
   [0] failed to execute bulk item (index) BulkShardRequest [[a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310][0]] containing [index {[a493b4f5-94b8-4e0c-adae-ffa94b0cc00b_es5-5-0_20200310 [AMISite][AXDDvmunDy-iZVInbl-9],
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
 at org.elasticsearch.index.mapper.DocumentParser.wrapInMapperParsingException( ~[elasticsearch-5.5.0.jar:5.5.0]
 at org.elasticsearch.index.mapper.DocumentParser.parseDocument( ~[elasticsearch-5.5.0.jar:5.5.0]
 at org.elasticsearch.index.mapper.DocumentMapper.parse(DocumentMapper.....

which doesn't make much sense. Is there a workaround, or do we still have to recreate the Big Data Store?
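One observation on the error above: both the GeoEvent log and the Elasticsearch log reject a latitude of exactly 90.0 because the geohash indexing for 102100 only accepts [-89.0, 89.0]. A possible workaround (an assumption on my part, not an official Esri fix) is to clamp incoming latitudes just inside that range before they reach the output, for example with a Field Calculator expression or a small preprocessing step. A minimal Python sketch, with hypothetical event field names:

```python
# Clamp latitudes to the range the spatiotemporal big data store's
# geohash indexing accepts for wkid 102100, per the error message:
# "illegal latitude value [90.0] for 102100 [-89.0,89.0]".
LAT_MIN, LAT_MAX = -89.0, 89.0

def clamp_latitude(lat: float) -> float:
    """Return lat limited to [LAT_MIN, LAT_MAX]."""
    return max(LAT_MIN, min(LAT_MAX, lat))

# Hypothetical events; "lat"/"lon" field names are placeholders.
events = [
    {"id": "733639", "lat": 90.0, "lon": 10.75},   # would be rejected
    {"id": "733640", "lat": 59.91, "lon": 10.75},  # already valid
]

for e in events:
    e["lat"] = clamp_latitude(e["lat"])

print([e["lat"] for e in events])  # [89.0, 59.91]
```

The same clamp could be expressed as a Field Calculator Processor expression in the GeoEvent service, so the pole-latitude records are adjusted before the BDS write rather than failing the whole bulk request.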


I have the same question. I wasn't able to find any documentation on how to go about adding a field to a service in the big data store. I have only been able to achieve this by deleting and recreating the service. Is there a way to do this without needing to delete the service, since that leads to outages on our end? Thanks!

Status changed to: Closed

Closing this in the ArcGIS Pro idea exchange as it is a duplicate of this idea in the Enterprise idea exchange: