GeoEvent: Debug Update Features Output

Blog Post created by eironside-esristaff Employee on Nov 15, 2019

This article discusses a targeted way to debug erroneous records created by GeoEvent in a hosted feature layer. For more general details on configuring and using the GeoEvent logging system, please see RJ Sunderman's series of blogs on the topic: Debug Techniques - Configuring the application logger



The Situation

We have a GeoEvent Service that is reading in events from a hosted feature layer, attempting to enrich the data, filtering out any data that doesn't get enriched, and writing the rest out to another hosted feature layer:

The filter is intended to pass only events that have a valid session_id.

Obviously, it is expected that the data table in the target hosted feature layer will not contain any features with a NULL session_id. Yet, when checked, there are occasionally records that get through the filter.
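For intuition, the filter's intended behavior is equivalent to a simple predicate. The sketch below mirrors the filter condition described above; it is an illustration in Python, not GeoEvent's actual implementation, and the exact definition of a "valid" session_id is an assumption:

```python
# Sketch of the intended filter logic: pass only events that carry a
# non-null session_id. Treating an empty/whitespace value as invalid
# is an assumption, not something GeoEvent necessarily does.
def passes_filter(event):
    """Return True only when the event has a usable session_id."""
    session_id = event.get("session_id")
    return session_id is not None and str(session_id).strip() != ""

events = [
    {"session_id": "abc-123"},  # should pass
    {"session_id": None},       # should be filtered out
    {},                         # missing field: filtered out
]
kept = [e for e in events if passes_filter(e)]
```

Any record with a NULL session_id reaching the target layer means an event somehow bypassed this condition.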



Follow the steps below to debug this issue:

  1. In the configuration file C:\Program Files\ArcGIS\Server\GeoEvent\etc\org.ops4j.pax.logging.cfg
    1. Set log4j2.appender.rolling.strategy.max to a large value, such as 200, so more rolled-over log files are retained
  2. In GeoEvent Manager
    1. Create a new GeoEvent Input, Poll an ArcGIS Server for Features, that reads records from your target hosted feature layer (the feature layer that is receiving the unwanted records with a NULL session_id)
      1. Be sure to set the WHERE clause to monitor for the error condition, e.g. session_id IS NULL
    2. Create a new GeoEvent Output, Send an Email, that will notify you when an erroneous record has been stored in the hosted feature layer
      1. Add your email to the Sender and Recipient list
      2. Set the message format to HTML
      3. In the body, consider adding the important fields from the failing event, e.g. SessionID: ${session_id}
      4. Include a link to the REST endpoint query method so you can quickly evaluate the invalid record(s), for example (substitute your own server and service names): https://&lt;your server&gt;/arcgis/rest/services/&lt;your service&gt;/FeatureServer/0/query?where=session_id+is+null&outFields=*&f=html
    3. Create a new GeoEvent Service called “Monitor for Errors”
      1. Add your Feature Service Poll input and Email output from above then connect them up
      2. Press the Publish button
    4. Create a new GeoEvent Output Write to a JSON File to write data to disk
      1. Name it file-json-out-before
      2. Set the Filename Prefix: before
      3. Modify other settings as needed
    5. Create a new GeoEvent Output Write to a JSON File to write data to disk
      1. Name it file-json-out-after
      2. Set the Filename Prefix: after
      3. Modify other settings as needed
    6. In your existing GeoEvent Service that is currently processing data
      1. Place the file-json-out-before before your final filter
      2. Place the file-json-out-after after your final filter
      3. Press the Publish button
    7. Set the LOG LEVEL on the following loggers to TRACE
      • com.esri.ges.transport.featureService.FeatureServiceOutboundTransport
      • com.esri.ges.httpclient.http
    8. Wait for an email
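The monitoring input created in step 2.1 boils down to a periodic query against the target layer's REST endpoint using the WHERE clause as a trap for bad records. As a rough illustration of the request it issues (a sketch only — the layer URL below is a placeholder, and this is not how GeoEvent is implemented internally):

```python
# Sketch of the REST query behind the "Poll an ArcGIS Server for
# Features" input. The layer URL passed in is a placeholder.
from urllib.parse import urlencode

def build_error_query(layer_url, where="session_id IS NULL"):
    """Build a feature layer query URL that returns only erroneous records."""
    params = {
        "where": where,    # the error condition being monitored
        "outFields": "*",  # return all attributes for inspection
        "f": "json",       # JSON response format
    }
    return layer_url.rstrip("/") + "/query?" + urlencode(params)

url = build_error_query(
    "https://example.com/arcgis/rest/services/Target/FeatureServer/0")
print(url)
```

Pasting such a URL into a browser (with f=html) is also a quick manual check for whether any NULL session_id records currently exist in the layer.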


While You Wait...

While you are waiting for an issue to occur, keep an eye on your disk space/usage for the files being written out:

  1. Periodically delete old karaf#.log files in your C:\Program Files\ArcGIS\Server\GeoEvent\data\log directory.
  2. Periodically delete old before#.json and after#.json files in the output directory you specified.
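A small housekeeping script can automate both cleanup chores. The sketch below keeps only the newest few files matching a pattern and deletes the rest; the directory paths in the usage comments are placeholders you would replace with your real log and JSON output directories:

```python
# Housekeeping sketch: keep only the newest `keep` files per pattern.
# Point this at your real GeoEvent log and JSON output directories;
# the paths in the usage comments below are placeholders.
import glob
import os

def prune_old_files(directory, pattern, keep=5):
    """Delete all but the `keep` most recently modified matching files."""
    files = sorted(glob.glob(os.path.join(directory, pattern)),
                   key=os.path.getmtime, reverse=True)
    for stale in files[keep:]:
        os.remove(stale)
    return files[:keep]  # the files that were retained

# Example usage (placeholder paths):
# prune_old_files(r"C:\Program Files\ArcGIS\Server\GeoEvent\data\log", "karaf*.log")
# prune_old_files(r"D:\geoevent\json", "before*.json")
# prune_old_files(r"D:\geoevent\json", "after*.json")
```

Be careful not to prune files covering the time window of an error you have not yet investigated.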


You've Got Mail!

Once you get an email indicating an error has occurred:

  1. Stop all GeoEvent Inputs, Outputs, and Services if you can to make sorting out the logs easier (or stop the GeoEvent Windows Service while you collect the files).
  2. Collect the files from the following locations. You are looking for files that contain data before and after the timestamp on the email.
    1. C:\Program Files\ArcGIS\Server\GeoEvent\data\log\karaf#.log
    2. <Your JSON Output File Directory>\before#.json
    3. <Your JSON Output File Directory>\after#.json
    4. You will want to collect logs from your ArcGIS Server during this time as well.
  3. If the failure record contains any sort of identifiable information, you can use that to locate the event in the files above.
    • If not, you may have to resort to combing through the timestamps until you find the exact time the event went through the system.
    • See RJ's blogs mentioned at the top of this page for tips/tricks on analyzing the log files.
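When the record does carry an identifier, a quick scan of the captured JSON files can tell you whether the event appeared before the filter, after it, or both, which narrows down where the logic broke. A rough sketch follows; it assumes one JSON object per line with attributes at the top level, which may not match what the Write to a JSON File output actually produces, so adjust the parsing to your files:

```python
# Sketch: scan captured GeoEvent JSON files for a known identifier.
# Assumes one JSON object per line with attributes at the top level;
# adjust the parsing if your output writes a different layout.
import glob
import json
import os

def find_event(directory, pattern, field, value):
    """Yield (filename, record) for every record whose `field` equals `value`."""
    for path in sorted(glob.glob(os.path.join(directory, pattern))):
        with open(path) as fh:
            for line in fh:
                line = line.strip()
                if not line:
                    continue
                try:
                    record = json.loads(line)
                except ValueError:
                    continue  # skip partial or garbled lines
                if record.get(field) == value:
                    yield os.path.basename(path), record

# Example: did the suspect event appear before and/or after the filter?
# hits_before = list(find_event(out_dir, "before*.json", "session_id", None))
# hits_after  = list(find_event(out_dir, "after*.json",  "session_id", None))
```

A NULL session_id record found in the "after" files but absent from the "before" files, for example, would point at something downstream of the filter rather than the filter itself.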