hi, all,
I have implemented a custom GeoEvent input (a custom Transport and Adapter) to handle the XML data from a UNS (so-called MUPs) server, and I have verified that all input data is parsed correctly by my custom adapter code.
To test it, I set up a GeoEvent Service with an Input connector (using the custom transport/adapter I implemented), a Field Mapper to map my custom GeoEvent Definition to the SDE feature service layer fields, and the default Output connectors: 1) Add a Feature, 2) Update a Feature, and 3) a file output for testing.
All seems to work fine, but randomly (though not frequently) and with no consistent pattern, some data is missing from the CSV file and from the MS SQL SDE database that backs the feature service layer.
I mostly used the default settings on the Add a Feature output: Update Interval = 1 second, Max Features Per Transaction = 500, Generate Flat JSON = Yes, Formatted JSON = No, Delete Old Features = No.
Is there any way for me to check why some data randomly goes missing from the outputs?
The data sources are about 150 cars, and each one emits data every 10 seconds, delivered from the UNS server over TCP.
When I trace the data in, I see that records from different cars sometimes arrive merged into a single payload, but my transport/adapter code parses them correctly and creates GeoEvents in the adapter.
Any comments or help on this issue are welcome, and big thanks in advance,
Munhwan
Hello Munhwan -
Sorry that no one has gotten back in touch with you on your question. If you found a solution in the meantime and wanted to share it here, I'm sure the community would appreciate it.
Since your concern is that features are not being added to a feature service, I would suggest setting the DEBUG level on com.esri.ges.transport.featureService.FeatureServiceOutboundTransport. From within GeoEvent Manager, navigate to Logs > Settings and specify the name of the component whose logging level you want to increase.
A benefit of debug logging the Feature Service Outbound transport is that you'll see (in the karaf.log) all of the JSON transactions being made between GeoEvent and the ArcGIS Server REST interface used to add / update features.
One problem I've seen is a feature service with a field of type esriFieldTypeString that specifies a length restriction of, say, 5 characters. Several events are received by GeoEvent, one of whose Name field (a String) is longer than the 5-character restriction. GeoEvent will successfully adapt the events it received, batch them up, and send out a REST request to add/update features once every second. ArcGIS Server receives the request, observes that one of the event records has a String whose length it cannot accept ... and rejects the transaction. The result is that none of the events in that transaction come through to update features.
You might find an indication of the rejection if you looked into the ArcGIS Server logs, but the only way you'd know from the GeoEvent karaf.log that there was a problem would be to look at DEBUG logging for the Feature Service Outbound transport and see the response received from the REST request GeoEvent made on the ArcGIS Server feature service's endpoint.
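If a field-length restriction does turn out to be the culprit, one defensive option is to clamp string values to the field's declared length before the event ever leaves your custom adapter. Here is a minimal sketch (plain Java; the class name, helper name, and usage are all illustrative, not part of the GeoEvent SDK):

// Illustrative guard: truncate a value to the maximum length declared by
// the feature service's esriFieldTypeString field so that one long value
// cannot cause ArcGIS Server to reject the whole batched transaction.
public final class FieldLengthGuard {
    private FieldLengthGuard() {}

    public static String clamp(String value, int maxLength) {
        if (value == null || value.length() <= maxLength) {
            return value;
        }
        return value.substring(0, maxLength);
    }
}

You may also be able to get a similar effect inside the GeoEvent Service itself with a Field Calculator processor, but either way the goal is the same: make sure no single record can poison a batched transaction.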
Now - you indicated in your message that you observed events missing from both the feature service and the CSV system file you were using to log the processed events. This I cannot explain. Obviously an issue with ArcGIS Server rejecting one or more events because they do not meet the requirements of the feature service's schema would have nothing to do with writing that same event data out to a system file. Completely different transports and adapters are being used to support these two outputs.
Were the event counts displayed on the GeoEvent Manager's Monitor page consistent? That is, did you receive 210 events In on your input, 420 events In/Out on the GeoEvent Service, and 210 events Out on each of your fs-out and text-csv-out outputs?
About all I can recommend - apart from attaching a debugger and stepping through the execution of your custom transport and adapter - is to incorporate a system file JSON or CSV output directly off of your input and a second system file JSON or CSV output (writing to a different file) downstream, after any filtering / processing you might be doing ... just before you output the event data to your fs-out to add/update features. Then you can compare the content of the two output files.
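To make that comparison concrete, here is a rough sketch of an offline check (plain Java, nothing GeoEvent-specific; it assumes both taps use the same CSV output configuration, so identical events produce identical lines):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.HashSet;
import java.util.Set;

// Compare two CSV taps from a GeoEvent Service and report records that
// reached the input-side file but never made it to the downstream file.
public class TapDiff {
    public static void main(String[] args) throws IOException {
        Set<String> upstream = new HashSet<>(
                Files.readAllLines(Paths.get(args[0])));  // tap directly off the input
        Set<String> downstream = new HashSet<>(
                Files.readAllLines(Paths.get(args[1])));  // tap just before fs-out

        upstream.removeAll(downstream);                   // keep only the records that were lost
        upstream.forEach(line -> System.out.println("missing downstream: " + line));
    }
}

Any lines this prints are events that survived your input but were lost somewhere in between, which narrows the search considerably.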
The data you see getting lost might be lost when an adapter fails to adapt a received payload of bytes, it might be discarded by a filter, it might be lost when a processor tries to process the data and throws an exception, or it might be lost when a REST transaction made by GeoEvent is rejected by the target server (as I describe above for adding / updating features).
- RJ
Thank you for your reply. I will try to apply your suggestions.
I changed the logging level to DEBUG, but not for any specific component, and I was overwhelmed with too much logging to check. I will set the DEBUG level on com.esri.ges.transport.featureService.FeatureServiceOutboundTransport only.
After setting that issue aside for several weeks, I now can't start the GeoEvent service at all; it fails with Error 1067 for no apparent reason.
Do you have any suggestions?
There is no logging at all in D:\Program Files\ArcGIS\Server\GeoEvent\data\log.
cheers,
Munhwan
Resolved: I got the GeoEvent service to start again by following another thread (GeoEvent Services Not running).
M
RJ,
>> Were the event counts displayed on the GeoEvent Manager's Monitor page consistent? That is, did you receive 210 events In on your input, 420 events In/Out on the GeoEvent Service, and 210 events Out on each of your fs-out and text-csv-out outputs?
Yes, they were consistent.
I set the DEBUG level on the com.esri.ges.transport.featureService.FeatureServiceOutboundTransport.
So far I can't find any exceptions or abnormal output; every transaction just reports 'success ...' along with its JSON.
I will let it run a little longer.
cheers,
Munhwan
Hi Munhwan,
How is your research going? Have you tried keeping a log of the records that should be getting passed through and seeing whether you can find a pattern by comparing them with the records that did make it through? We have had some luck with this comparison when things that should be sent through aren't. You might also want to look at the Site > GeoEvent Definition(s) generated for this service and compare them with the records that didn't get passed through.
RJ recommended a debugging approach that I have found useful in situations similar to what you've described: step through the execution by attaching a new Output (system file JSON or CSV) directly off of your input and a second Output (system file JSON or CSV, writing to a different file) after your business logic downstream. Run the service, compare how many records were output, then move the second Output to another spot in your logic and see if you can pinpoint the processor or filter that's giving you trouble.
Good luck, Katie
Thank you both for the feedback on this issue, Katie and RJ.
A PM has stopped me from investigating this issue for now, since more data is going missing due to malformed data coming from the UNS server into GeoEvent. The first two bytes of each message give the size of the main data that follows, but sometimes the raw data of that size contains malformed content, so my custom Transport code rejects those messages before sending them to the Adapter. The client wants to investigate those cases with the H/W company first.
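For reference, the framing logic in my transport works roughly like this (a simplified sketch, not my production code; isWellFormed() and handOffToAdapter() are placeholders for the MUPs-specific validation and the hand-off to the adapter):

import java.nio.ByteBuffer;

// Simplified sketch of the two-byte length-prefixed framing over TCP.
public class MupsFraming {

    public void frame(ByteBuffer in) {
        while (in.remaining() >= 2) {
            in.mark();                          // remember position in case the frame is incomplete
            int size = in.getShort() & 0xFFFF;  // two-byte size prefix, read as unsigned
            if (in.remaining() < size) {
                in.reset();                     // partial frame: wait for more bytes to arrive
                return;
            }
            byte[] payload = new byte[size];
            in.get(payload);
            if (isWellFormed(payload)) {
                handOffToAdapter(payload);      // forward one complete, valid message
            }                                   // malformed frame: skip it and keep reading
        }
    }

    private boolean isWellFormed(byte[] payload) { return payload.length > 0; } // placeholder check
    private void handOffToAdapter(byte[] payload) { /* placeholder hand-off */ }
}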
I will keep your suggestion in mind and check the missing data from the Adapter (the GeoEvent object) through to the final Add a Feature and CSV outputs. Based on my first investigation, it happens very rarely and randomly.
Munhwan
hi, Katie and RJ,
I think I found out why the missing data happens, but I don't know how to fix it.
Could anyone help me with the custom Inbound Adapter API?
My custom Inbound Adapter extends 'InboundAdapterBase' and parses the 'ByteBuffer' from my custom transport object in the 'adapt' method, which can return only one 'GeoEvent' object, even though the 'ByteBuffer' may contain enough information to create more than one GeoEvent. I don't see how to send them all out.
Inside the FOR loop in my 'adapt' method, more than one GeoEvent object is sometimes generated, but only the last one is returned from the InboundAdapter to the GeoEvent Service. I think that is why the missing data happens.
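In other words, my adapt() currently does something like this (a simplified view; parseRecords(), ParsedRecord, and buildGeoEvent() are stand-ins for my actual parsing code):

// Simplified view of the problem: adapt() can return only one GeoEvent,
// so events built earlier in the loop are silently overwritten.
@Override
public GeoEvent adapt(ByteBuffer buffer, String channelId) {
    GeoEvent event = null;
    for (ParsedRecord record : parseRecords(buffer)) {
        event = buildGeoEvent(record);  // each iteration overwrites the previous event
    }
    return event;                       // only the last record's GeoEvent is returned
}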
I can't easily figure this out from the GeoEvent API documentation.
After analyzing the 'trimble-taip-adapter' sample code, I am considering using a 'Thread' along with the geoEventListener; I am confident that will resolve the issue I described above.
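Something like the following is what I have in mind, modeled on that sample (a rough, untested sketch; it assumes InboundAdapterBase exposes the geoEventListener field the way the trimble-taip-adapter sample uses it, and it reuses the same stand-in parsing helpers as above):

// Instead of returning a single GeoEvent from adapt(), override receive()
// and push every event parsed from the buffer to the listener directly.
@Override
public void receive(ByteBuffer buffer, String channelId) {
    for (ParsedRecord record : parseRecords(buffer)) {
        GeoEvent event = buildGeoEvent(record);
        if (event != null) {
            geoEventListener.receive(event);  // emit each GeoEvent, not just the last one
        }
    }
}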
Thanks in advance for any input on this,
Munhwan