GeoEvent Processor Add Feature - Date Problems Deleting old features

10-08-2015 12:51 PM
JonathanHouck
New Contributor III

Hi everyone,

I've been working with the GeoEvent processor for the past few days and am trying to establish a historical record of data coming in through one of our inputs. Currently, one of our GeoEvent Services has two update outputs and a text output, all working flawlessly. However, I've been running into major issues with the date field in the Add Feature output.

At first, the date field in the Add Features output wouldn't populate at all, despite the field's name matching the service object. I updated the expected date property on the input, and it started working. The values were in the wrong time zone, so I applied a processor to field-calculate the date field by -25200000 milliseconds (seven hours) to adjust for the time zone we're in, and that was also working.
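For reference, -25200000 ms works out to exactly seven hours, i.e. a UTC-7 offset. A quick sketch of the arithmetic in plain Java (nothing GeoEvent-specific; the class name is just for illustration):

```java
import java.util.concurrent.TimeUnit;

public class TzOffset {
    // Millisecond adjustment for a fixed UTC offset given in hours,
    // e.g. -7 for a UTC-7 time zone
    static long offsetMillis(int hours) {
        return TimeUnit.HOURS.toMillis(hours);
    }

    public static void main(String[] args) {
        System.out.println(offsetMillis(-7)); // prints -25200000
    }
}
```

Note that a fixed millisecond offset ignores daylight-saving transitions, which is one reason storing timestamps in UTC and converting on display is usually preferred.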

At 4 AM this morning, everything stopped working. The feature class had hit ~85,000 rows and nothing else was being written. I had the "delete old features" option enabled, set to twelve hours (checking every hour), and the features stopped updating right around when that deletion should have kicked in.

I reconfigured the environment with a fresh copy of the feature class I was writing to, and set it to delete everything more than an hour old. It worked for an hour, and then I started getting these messages:

Failed to parse the time value '2015-10-08T19:30:00Z' using the custom format. Using the default date format.
java.text.ParseException: Unparseable date: "2015-10-08T19:30:00Z"
    at java.text.DateFormat.parse(DateFormat.java:357)[:1.7.0_65]
    at com.esri.ges.adapter.json.FeatureJsonInboundAdapter.parseFeature(FeatureJsonInboundAdapter.java:317)[258:com.esri.ges.framework.adapter.feature-json-adapter:10.3.0]
    at com.esri.ges.adapter.json.FeatureJsonInboundAdapter.receive(FeatureJsonInboundAdapter.java:164)[258:com.esri.ges.framework.adapter.feature-json-adapter:10.3.0]
    at com.esri.ges.manager.stream.internal.InboundAdapterProxy.receive(InboundAdapterProxy.java:39)[325:com.esri.ges.manager.internal-streammanager:10.3.0]
    at com.esri.ges.manager.stream.internal.InboundStreamImpl$DataProcessor.run(InboundStreamImpl.java:79)[325:com.esri.ges.manager.internal-streammanager:10.3.0]

After that, not only are the old features no longer deleted, but no new features are added to the feature class. My expected date format is: dd/MM/yyyy hh:mm:ss
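The mismatch in that log is reproducible with plain java.text.SimpleDateFormat, which is what the stack trace shows the adapter using: a dd/MM/yyyy hh:mm:ss pattern cannot parse the ISO 8601 value 2015-10-08T19:30:00Z, while a pattern matching the incoming value can. The class and helper below are illustrative, not GeoEvent internals:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.TimeZone;

public class DateFormatCheck {
    // Returns true if the value parses under the given SimpleDateFormat pattern
    static boolean parses(String pattern, String value) {
        SimpleDateFormat fmt = new SimpleDateFormat(pattern);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        try {
            fmt.parse(value);
            return true;
        } catch (ParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String incoming = "2015-10-08T19:30:00Z";
        // The configured expected format: fails with the ParseException seen in the log
        System.out.println(parses("dd/MM/yyyy hh:mm:ss", incoming));      // false
        // A pattern that matches the ISO 8601 values actually arriving
        System.out.println(parses("yyyy-MM-dd'T'HH:mm:ss'Z'", incoming)); // true
    }
}
```

This suggests the "expected date format" on the input may need to match the ISO 8601 form the service is now sending, rather than dd/MM/yyyy hh:mm:ss.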

The environment is Windows Server 2008 R2 with ArcGIS Server 10.3, and the SDE database is Oracle 11g.

Any advice on things to try would be greatly appreciated, and I'd be happy to provide any further info. Thank you!

Jon

1 Reply
DennisGeasan
Occasional Contributor II

Are you using an "Add Feature" output or an "Update Feature" output? 

If you are using an "Add Feature" output for historical data collection and you turn on the delete option, I think what will happen is GEE will start deleting all records older than "Maximum Feature Age". So the 85,000-record ceiling you are observing may actually be the most recent 85,000 records. The "Unique Feature Identifier Field" parameter is not present in the "Add Feature" output, so GEE has only the 'datetime' field to use for "search and destroy".
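Conceptually, an age-based delete amounts to computing a cutoff timestamp from "Maximum Feature Age" and removing anything whose time value falls before it. A sketch of that cutoff arithmetic (the SQL in the comment is an assumption about how such a delete might be phrased, not confirmed GeoEvent behavior):

```java
import java.util.concurrent.TimeUnit;

public class MaxFeatureAge {
    // Epoch-millisecond cutoff: features with a datetime earlier than
    // this value are candidates for deletion
    static long cutoffMillis(long nowMillis, int maxAgeHours) {
        return nowMillis - TimeUnit.HOURS.toMillis(maxAgeHours);
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis();
        long cutoff = cutoffMillis(now, 12); // 12-hour Maximum Feature Age
        // Conceptually: DELETE FROM featureclass WHERE datetime_field < cutoff
        System.out.println(now - cutoff); // prints 43200000 (12 hours in ms)
    }
}
```

Because the cutoff is evaluated purely against the 'datetime' field, any record whose date failed to parse (or was written with the wrong value) can be swept up or missed by this check.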

For an "Update Feature" output, the delete option deletes records based on the 'datetime' field and the "Unique Feature Identifier Field". Also, this output is not an append action like "Add Feature"; the intention is to update field values on a table of records uniquely defined by those two fields.

Is there a reason you need to delete records from a historical data collection? If so, you would have to use an "Update Feature" output, and you would have to do work in the input to ensure each value targeted for the output's "Unique Feature Identifier Field" is unique. When it is time to delete an existing record, simply stop updating the 'datetime' field for that record. HOWEVER, this will prove very compute-intensive over time if the target feature layer gets large: the "Update Feature" output would need to update the 'datetime' field of every record not intended for deletion, and do this for each output event. If you are streaming lots of data at a fast rate, I'm not sure GEE could keep up.

DG
