GeoEvent Extension 10.3.1 Input (JSON over TCP) stops working if the input receives more than one message per 100ms.

07-29-2015 04:05 AM
ThorstenBraun
New Contributor II

Maybe someone has an idea for a workaround or solution to the following problem:


The GeoEvent Extension 10.3.1 input (JSON over TCP) stops working if it receives
more than one message in ~100 ms. The incoming GeoEvents are not dropped, as they
appear to be in 10.2.2; instead the input/TCP port stops working altogether.

Even restarting the input connector does not solve the problem. Once the
input connector has stopped working, no more data is received until the
complete GeoEvent Service has been restarted.

We had problems with significant packet loss in 10.2.2, so we decided to move to 10.3.1.

Now we are running into the problem described above: once the input has stopped working, the incoming data is no longer processed.

We reproduced the problem with a simple Python tool that sends JSON data to the TCP socket.

The test script always sends the identical JSON document, which is about 1000 characters long.

The GeoEvent Service always processes some input events (5 to 20, the number varies), and then the input stops working as described above.

Test Environment:

  • Windows: Server 2012 R2 and Windows 7 Professional
  • Running on a Hyper-V virtual machine with 16 GB RAM and 4 cores
  • Tested on 3 different machines (one federated with Portal, the others with blank ArcGIS Server and GEP)
  • ArcGIS version: 10.3.1
  • GEP connector: TCP JSON IN

The most important aspect is that the input connector does not recover by itself; it remains
completely broken until the whole GEP is restarted. That leaves us no way to evaluate in more
detail where the problem might be, and we did not find anything helpful in the log files.

It would be great if we could get some assistance or any hint as to what exactly the problem is, and perhaps how to work around it.

Unfortunately we are limited to TCP/JSON as the input, because this is the interface to our customer.
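One workaround we are considering (purely a sketch, not verified): since the input seems to survive as long as messages arrive no faster than roughly one per 100 ms, a small local relay could accept the customer's TCP/JSON feed and forward it to the GeoEvent input with an enforced minimum gap between sends. The listen port, the GeoEvent host/port and the 0.11 s gap below are placeholder assumptions:

# Untested sketch of a throttling relay: it accepts the customer's TCP/JSON
# feed and forwards it to the GeoEvent input, pausing between forwarded
# chunks so the input never sees more than roughly one message per 100 ms.
# LISTEN_PORT, GEOEVENT_HOST, GEOEVENT_PORT and MIN_GAP are assumptions.
import socket
import time

LISTEN_PORT = 16057        # where the customer connects (hypothetical)
GEOEVENT_HOST = '127.0.0.1'
GEOEVENT_PORT = 15057      # the GeoEvent TCP/JSON input
MIN_GAP = 0.11             # seconds to wait between forwarded chunks

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind(('0.0.0.0', LISTEN_PORT))
server.listen(1)

while True:
    client, _ = server.accept()
    upstream = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    upstream.connect((GEOEVENT_HOST, GEOEVENT_PORT))
    try:
        while True:
            chunk = client.recv(4096)
            if not chunk:
                break
            upstream.sendall(chunk)
            time.sleep(MIN_GAP)   # enforce the pacing towards GeoEvent
    finally:
        upstream.close()
        client.close()

This obviously trades the hang for extra latency and only handles one upstream connection at a time, but it would let us keep TCP/JSON towards the customer.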

1 Solution

Accepted Solutions
TimothyStiles
New Contributor III

Changes made for the 10.4 release appear to address this issue. A potential fix for the 10.3.1 release was delivered to a customer for verification. Other customers requiring this fix for the 10.3.1 release should work with their Technical Account Manager to submit a hot fix request. Please keep in mind that hot fixes are targeted releases and may not be applicable in all customer environments.

14 Replies
GianlucaCaporossi
New Contributor II

Hi,

for your information: we use TCP/geomessages with peaks of more than 300 tracks per second without any problems.

Windows: Server 2008 R2

Running: on an Amazon cloud machine with 8 GB RAM and 2 cores

ArcGIS: version 10.3.1

The tests were made using one machine for sending tracks and another for receiving the GeoEvents.

Using a single machine for both sending and receiving slows down performance considerably.

I hope this information is useful to you.

If you want, I can test your file on my server.

Ciao,

Gianluca

ThorstenBraun
New Contributor II

Hi,

I'm using the following configuration for the GeoEvent input:

[Screenshot of the input configuration: geoevent_input.PNG]

# Python script to reproduce the problem.

import socket
import json
import time

# Each message is a single JSON object with a ~1000-character filler string
# (shortened here to '1234567890' * 100; the original used a literal of about
# that length) plus a running counter, sent as one JSON document.

for j in range(10):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    print("connecting.. " + str(j))
    s.connect(('127.0.0.1', 15057))

    # sending the message 20 times on this connection
    for i in range(1, 21):
        print("send data.. " + str(i * j))
        s.send(json.dumps([{'sampleString': '1234567890' * 100 + str(i * j)}]))
        time.sleep(0.1)  # remove this line to break down the GeoEvent input

    s.shutdown(1)
    s.close()
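For what it's worth, a quick way to see what the script actually puts on the wire is to point it at a small local debug listener instead of the GeoEvent port (port 15058 below is just an example and has nothing to do with GeoEvent). Because TCP is a byte stream, several JSON documents sent back-to-back on one connection can arrive concatenated in a single read, which may be relevant to how the input splits incoming data. A minimal sketch of such a listener:

# Minimal debug listener (hypothetical port 15058) to inspect what a sender
# actually transmits; it accepts a single connection and prints how many
# bytes each read returns. Messages sent back-to-back may show up
# concatenated in one recv() because TCP has no message boundaries.
import socket

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.bind(('127.0.0.1', 15058))
srv.listen(1)
conn, addr = srv.accept()
print("connection from " + str(addr))
while True:
    data = conn.recv(65536)
    if not data:
        break
    print("received " + str(len(data)) + " bytes")
conn.close()
srv.close()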

GianlucaCaporossi
New Contributor II

I tested on my machine:

in: GenericJSON / TCP

out: CSV file output

I removed the line time.sleep(0.1) and sent 10,000 JSON messages.

Everything works without problems.

G.

ThorstenBraun
New Contributor II

My Input was configured like this:

[Screenshot of the input configuration: geoevent_input.PNG]

GianlucaCaporossi
New Contributor II

[Screenshots of the configuration: json1.PNG, json2.PNG, json3.PNG, json4.PNG]

ThorstenBraun
New Contributor II

Hi,

strange. In my opinion there must be a difference somewhere; we were able to reproduce this on several machines.

- Which operating system are you testing on?

- What is the locale of your OS?

- Are you sending to localhost or from another machine to your GeoEvent service?

- Do you have any special configuration for the GeoEvent extension?

- Is there a patch for 10.3.1 that I haven't noticed yet?

# Here is another script which breaks down the input.
# -*- coding: cp1252 -*-
import socket
import json
import time

# Open 100 connections and send 3 messages on each, with no delay between
# sends. The payload is a ~1000-character filler string (shortened here to
# '1234567890' * 100) plus a running counter to make each message unique.
for i in range(100):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.connect(('127.0.0.1', 15057))

    for j in range(3):
        print("send data.. " + str(i))
        s.send(json.dumps({'sampleString': '1234567890' * 100 + str(i * 10 + j)}))

    s.shutdown(1)
    s.close()
    # time.sleep(1)
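One more thing that might be worth trying (this is an assumption on my side; I have not verified whether the GenericJSON input pays any attention to it): terminate each JSON document with a newline so that consecutive messages have an explicit boundary even when they land in the same TCP segment. A minimal variant of the send loop:

# Variant of the send loop that ends every JSON document with a newline as an
# explicit message boundary. Whether the GenericJSON input actually honours
# such a separator is an assumption, not something verified.
import json
import socket

payload = json.dumps({'sampleString': '1234567890' * 100})

s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
s.connect(('127.0.0.1', 15057))
for i in range(20):
    s.sendall(payload.encode('utf-8') + b'\n')  # newline as delimiter
s.shutdown(socket.SHUT_WR)
s.close()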

GianlucaCaporossi
New Contributor II

- Which operating system are you testing on? Windows Server 2008 R2, Service Pack 1

- What is the locale of your OS? Cloud Amazon

- Are you sending to localhost or from another machine to your GeoEvent service? I tested both ways.

- Do you have any special configuration for the GeoEvent extension? No.

- Is there a patch for 10.3.1 that I haven't noticed yet? No.

I also tested the new script that you sent me, and I can confirm that it works without problems.

Attached are the export files of the input, output, and service used.

XuehanJing
New Contributor III

Hello all,

The same issue has been tested and reproduced by Technical Support, and we have logged a bug for it. Our development team has also reproduced the issue and is currently working on it. The bug ID BUG-000 89145 is for your reference.

Thanks.

XuehanJing
New Contributor III

Sorry, the correct bug number is BUG-000089145.
