Output record rejected if JSON text field size exceeds feature class text size.

01-27-2014 12:07 PM
DennisGeasan
Occasional Contributor II
FYI - I've discovered that if the content of an incoming JSON text field exceeds the text field size of the destination feature class attribute field, the entire JSON record is rejected.  There appears to be no warning of this within GEP.  The GEP monitor will indicate that all records from the input have passed through the GE Service and the GEP output.  However, the records will not get added or updated in the target feature class.  The same thing happens with integers: if the input integer value is larger than the 32-bit range allowed in the Geodatabase, the record is rejected.  It would be handy if a GEP output could detect this situation and produce an error log file listing rejected records with an explanation.
DG
2 Replies
RJSunderman
Esri Regular Contributor
Hey Dennis -

We've taken note of the limitation with regard to event data not making it into the intended feature class due to incompatibilities with the feature class's properties. We'll see what we can do to have GEP generate some sort of warning or error message ...

What you are experiencing, though, is consistent with the design of Outputs in general. The GeoEvent Service will send a GeoEvent to an Output, and if the Output is able to use its adapter and transport to successfully dispatch the event, then it's done its job. If the XMPP server, email daemon, web server, ArcGIS for Server (etc.) is unavailable, it's like posting a letter and the Post Office failing to deliver it: GEP has done what it can do, but it cannot guarantee that an instant message or email (for example) was successfully delivered.

There are steps you can take to help ensure event data is prepared for receipt by a feature service. If you suspect a data provider will send integer values that exceed the 32-bit range of an esriFieldTypeInteger (which corresponds to a SQL Server signed int; GEP itself supports 64-bit long integers), you could configure a filter to catch values that won't survive entry into the feature service and route those events to a system text file. Pairing the filter with a field calculator, you could even enrich the event with your own error text so that, reading the text file, you would know which events didn't get created as features (and why).
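To illustrate the range check that such a filter performs, here is a minimal Python sketch (not GEP itself; the event field name "count" is a hypothetical example) of flagging integer values that would not survive insertion into an esriFieldTypeInteger field:

```python
# Signed 32-bit integer bounds, matching esriFieldTypeInteger.
INT32_MIN, INT32_MAX = -2**31, 2**31 - 1

def fits_int32(value: int) -> bool:
    """True if value fits a signed 32-bit integer field."""
    return INT32_MIN <= value <= INT32_MAX

# Sample events; "count" is a hypothetical attribute name.
events = [{"id": 1, "count": 42}, {"id": 2, "count": 2**35}]

# Events that would be rejected by the feature service could be
# diverted to a text file with explanatory error text.
rejected = [e for e in events if not fits_int32(e["count"])]
```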

Similarly with strings whose length is too long for your feature layer's field: you could use a regular expression such as ^.{0,32} in a Field Calculator (Regular Expression) to truncate strings to a fixed length (in this case 32 characters). This will become easier with the string functions being supported in the regular Field Calculator with the 10.2.2 product release.
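The effect of that regular expression can be sketched in Python (an illustration of the pattern, not the Field Calculator processor itself): ^.{0,32} greedily matches up to the first 32 characters, so keeping only the match truncates the string.

```python
import re

def truncate32(text: str) -> str:
    """Keep at most the first 32 characters, via the ^.{0,32} pattern."""
    return re.match(r"^.{0,32}", text).group(0)

truncate32("x" * 50)   # a 50-character string is cut to 32
truncate32("short")    # strings already within the limit pass unchanged
```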

I think I'll work that example into the Intro Tutorial: using a regular expression to trim a string's length. I kind of like that ...

Hope this helps -
RJ
DennisGeasan
Occasional Contributor II
Hi RJ,
Based on the general design it makes sense that there are no warnings, and perhaps the most common use of GEP involves only a few data fields in the stream.  Most of the data streams I'm working with have many attribute fields, so what you suggest for controlling content is a solution, but possibly rather tedious and difficult to manage.

For the 'add' and 'update' outputs: at the Geodatabase level (the target feature classes) I think an error would be generated for violating the field constraints.  Would or can that error bubble up to the ArcGIS Server service?  If so, then maybe you could listen for it on the GEP 'add' and 'update' outputs.  Just knowing that records are failing to be updated/inserted would be sufficient.

As for using regular expressions or string functions: how about providing a processor that combines field mapping with "on-the-fly" field processing?  That way, in one form, you could map each field to its own calculation and direct the result to the output schema field, with just one additional column in the present field mapper form.  An empty field-processing value would of course mean no processing on that field.  Field "calcs" in this form would not be as rich as the standalone field calculator (i.e., no summations, etc.) but would provide the opportunity for on-the-fly data formatting.
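The suggested combined processor might look something like this sketch (purely hypothetical; the field names and the per-field transform idea are illustrative, not an existing GEP feature). Each output field maps to a source field plus an optional calculation; None means no processing:

```python
# Hypothetical "field map + per-field calc" table, one row per output field.
field_map = {
    "name":  ("full_name", lambda s: s[:32]),  # truncate to field width
    "count": ("hits",      None),              # None = pass through untouched
}

def map_record(record: dict) -> dict:
    """Apply the field map to one incoming record."""
    out = {}
    for target, (source, calc) in field_map.items():
        value = record[source]
        out[target] = calc(value) if calc else value
    return out
```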

Thanks for the reply.
DG