
Error when outputting, "Malformed data. Length is negative: -40"

10-24-2018 09:58 AM
NathanKoski
Regular Contributor

Hi there,

When my service goes to output using the update feature process, I constantly see a series of errors coming from com.esri.ges.messaging.jms.GeoEventBytesEncoder. The most common are:

  • "null java.lang.NullPointerException"
  • "An unexpected error occurred while attempting to serialize the GeoEvent to a Byte Array. Error: null."
  • "Malformed data. Length is negative: -40 org.apache.avro.AvroRuntimeException: Malformed data. Length is negative: -40 at org.apache.avro.io.BinaryDecoder.doReadBytes(BinaryDecoder.java:336) [18:avro:1.8.2]"
  • "An unexpected error occurred while attempting to serialize the GeoEvent to a Byte Array. Error: Malformed data. Length is negative: -40." (this one appears as an INFO-level entry when I have the log set to DEBUG; the full error is in a screenshot below)

I cannot figure out what is causing them. During some research, I found that similar errors seem to come from using nested GeoEvent Definitions, but I am not using any nesting.

The service outputs into a multipoint and a polygon, and both give this error. However, it does not seem to have trouble outputting into a CSV file; inside the CSV file the data looks perfect and shows no problems. I have tried deleting and manually remaking the definition, which did not help. The error does not happen constantly; it seems to come and go.

I was curious if anyone has experienced this before. The one that is really confusing me is the data length of -40, because I cannot figure out how that is possible. RJ Sunderman
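One clue I came across while digging: Avro encodes length prefixes as zigzag-encoded varints, so if a decoder falls out of alignment with the schema the writer used, it can read an arbitrary byte as a length and come up with a negative number. A minimal Python sketch of the zigzag varint decode (my own illustration of the encoding rule, not GeoEvent or Avro library code) shows that a single stray byte 0x4F would decode to exactly -40:

```python
def zigzag_decode_varint(data: bytes) -> int:
    """Decode one Avro zigzag-encoded varint long from a byte sequence."""
    n = 0
    shift = 0
    for b in data:
        n |= (b & 0x7F) << shift   # accumulate 7-bit groups, little-endian
        if not (b & 0x80):          # high bit clear means last byte
            break
        shift += 7
    # zigzag mapping: even -> non-negative, odd -> negative
    return (n >> 1) ^ -(n & 1)

print(zigzag_decode_varint(b"\x4f"))  # -40
```

So a "Length is negative: -40" does not mean the data was written with a negative length; it more likely means the reader started decoding at the wrong byte offset, or with a different schema than the writer used.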

Thanks!

3 Replies
RJSunderman
Esri Regular Contributor

Hello Nathan -

I've not seen the error you are reporting before. If the inbound connector (input) is able to receive and adapt data to create an event record, and these event records process through a GeoEvent Service, and data can be logged as text to a CSV file ... then my understanding is that event records have undergone several Avro serialization / deserialization cycles.

We cannot tell, from the screen capture you provided, whether the INFO message being logged by the ges.messaging.jms encoder is in response to a problem encountered by an inbound connector's transport or adapter, if the message is being logged because a processor you've configured is unable to handle event records it has received, or if -- as you suggest -- the error is coming from a failure to add or update feature records through a feature service.

My first step would be to remove any processors or filters from my GeoEvent Service and see if I can successfully ingest event records and log their data as JSON in a system file. I prefer the Write to a JSON File output, as the JSON format supports hierarchy and multicardinality that delimited text (e.g. CSV) does not. Also, as you're probably aware, event records must be "simplified" to a flat structure without any hierarchy or multicardinality before they can be sent to an output tasked with adding or updating feature records through a feature service. This step, I think, will help us figure out whether all of the data being sent to GeoEvent Server is being processed through to an output, or whether some portion of the data is being rejected on the inbound side.
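To illustrate what I mean by "simplified": flattening collapses any nested groups or lists into top-level fields. Conceptually it looks something like this rough Python sketch (the idea only; this is not GeoEvent's actual flattening logic or field-naming convention):

```python
def flatten(record: dict, prefix: str = "") -> dict:
    """Recursively collapse nested dicts/lists into a single-level dict."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            # nested group: promote its fields with a prefixed name
            flat.update(flatten(value, name + "_"))
        elif isinstance(value, list):
            # multicardinal field: one flat field per element
            for i, item in enumerate(value):
                if isinstance(item, dict):
                    flat.update(flatten(item, f"{name}_{i}_"))
                else:
                    flat[f"{name}_{i}"] = item
        else:
            flat[name] = value
    return flat

print(flatten({"id": 1, "pos": {"x": 2, "y": 3}, "tags": ["a", "b"]}))
# {'id': 1, 'pos_x': 2, 'pos_y': 3, 'tags_0': 'a', 'tags_1': 'b'}
```

A structure like that can be written to a single CSV row or mapped onto the flat attribute schema of a feature service.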

Please open an incident with Esri Technical Support so that an analyst can be assigned to work with you and track the steps being taken to address the issue. If this ends up being a bug in an input, processor, or output where a certain type of data is not being handled properly, the product team will need information from technical support before we start work to identify a root cause.

Best Regards –

RJ

NathanKoski
Regular Contributor

Hi RJ Sunderman

Following your advice, I removed all of the filters/processors from the service. Doing so seems to result in no errors. However, I was mistaken earlier: the errors do happen even when dumping into a CSV file.

Working under the assumption that the original data is completely intact, I decided to test each processor individually. I learned that when I use the Envelope Creator processor and then Buffer on the same data, the errors appear. All of the other workflow paths are fine and leave no errors, but when I pass my data through Envelope Creator and then buffer the envelope, it breaks. There is a screenshot attached that shows the broken workflow. Replacing Envelope Creator with Convex Hull Creator fixes the problem.
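(For anyone following along: the envelope is just the axis-aligned bounding box of the geometry. Purely for illustration, in Python terms it amounts to no more than this; GeoEvent's implementation will of course differ:)

```python
def envelope(points):
    """Axis-aligned bounding box of a multipoint: (xmin, ymin, xmax, ymax)."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

print(envelope([(1, 5), (3, 2), (0, 4)]))  # (0, 2, 3, 5)
```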

It seems like something is going wrong with how those two processors interact. Is it still worth opening a ticket, or is there a better place to submit a formal bug report? The odd thing is that, overall, the enveloped data seems to render on the map more or less fine...

For now, I will simply use Multipoint > Convex Hull > Buffer instead, as it gives me similar results with no errors.
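In case it's useful context for whoever picks this up: a convex hull is the smallest convex polygon containing all of the points, which is why it works as a drop-in replacement for the envelope here. The standard monotone-chain algorithm, sketched in Python (illustrative only; I have no idea what GeoEvent uses internally):

```python
def _cross(o, a, b):
    """2D cross product of vectors OA and OB; >0 means a left turn."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Return hull vertices in counter-clockwise order (monotone chain)."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower = []
    for p in pts:
        while len(lower) >= 2 and _cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and _cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop the last point of each chain (it repeats the other chain's start)
    return lower[:-1] + upper[:-1]

print(convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)]))
# [(0, 0), (2, 0), (2, 2), (0, 2)] -- the interior point is dropped
```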

Thanks for your fast replies!

Nate

RJSunderman
Esri Regular Contributor

Hey Nate,

Yes, if you would please submit a technical support incident, that will help get a bug report formally documented. If you have JSON data we could send to a Receive JSON input to reproduce the issue, that would also be a huge help.

- RJ