POST
Hello Walter -- The short answer, I think, is that GeoEvent Server displays Java data types in its pick list: when you configure a GeoEvent Definition you choose a data type for an event record attribute from one of the available types above. The esriFieldTypeXXX data types come into play when looking at an ArcGIS Enterprise hosted feature layer or feature service which stores feature records in an enterprise geodatabase. The .NET API help topic lists the supported data types for feature record attributes. GeoEvent Server does not support several of these (like esriFieldTypeBlob). Others, like esriFieldTypeInteger, are compatible with a GeoEvent Definition specifying the 'Java' data type Integer.

It can get a little tricky keeping straight the differences between the 64-bit Java data types used by GeoEvent Server and the 32-bit data types used by other parts of ArcGIS Enterprise. Luckily, date/time values are always compatible between the 'Java' type Date and the ArcGIS Enterprise esriFieldTypeDate data type. So long as the event records you route to a Send Features to a Stream Service output (for example) have date/time values mapped as Date, the outbound adapter should be able to construct a data record formatted as Esri Feature JSON representing the date/time as an epoch long integer value (a 13-digit integer representing milliseconds since the Unix Epoch). That is what you should see in the ArcGIS REST Services Directory when subscribing to a stream service to observe the feature records the service broadcasts. It is also what you see when querying a feature service "at REST" using the query endpoint the feature service exposes. What you see in a pop-up in a web mapping application is up to the web application -- it will probably represent the epoch value it gets from the database as a more human-readable string, and might even do you the favor of shifting the value from UTC to the local time of your client/server machine.
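To make the epoch representation concrete, here is a minimal Python sketch of the conversion a web map performs before rendering a readable string. The 13-digit value is an example I made up for illustration; any value a stream service broadcasts in its Esri Feature JSON would behave the same way.

```python
from datetime import datetime, timezone

# Example 13-digit epoch value as broadcast in Esri Feature JSON
# (milliseconds since the Unix Epoch, always UTC).
epoch_ms = 1639681800000

# Divide by 1000 to get seconds, then interpret the value as UTC --
# this mirrors what a web map does before shifting to local time.
utc_time = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
print(utc_time.isoformat())  # 2021-12-16T19:10:00+00:00
```

A client application would then shift this UTC value to the local time zone for display, which is why the string in a pop-up can differ from the raw epoch integer you see querying the service "at REST".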
I’m not familiar with the GeoEvent Server component logger StreamService-IRIS-delete captured in your screenshot. I would want to take a look at how the event processing upstream from the Send Features to a Stream Service output has been configured. I’m just guessing, but maybe by the time the event record reaches the output the data type for unit-activitydatetime has changed? Maybe it is a String or a Long integer value rather than a Date? The product team does rely on Esri Tech Support to work through these types of issues with customers, so I would encourage you to open an incident with Esri Tech Support so that an analyst can be assigned to investigate the issue and assist with reproducibility (if necessary). If you think it necessary you can request a GeoEvent Server specialist be assigned when you open an incident with Esri Technical Support. Hope this information helps -- RJ
12-16-2021 11:10 AM

POST
@NathanKoski -- I believe the issue you are running into is related to a couple of bugs specific to using GeoTagger processors configured in parallel as you show in your illustration. A fix included in the latest 10.8.1 patch addresses one of the issues (when writing enrichment / tagged values to existing fields). That issue has also been addressed in the 10.9 / 10.9.1 release. If, however, you use a pair of GeoTagger processors working in parallel to enrich event records as tracked assets enter and leave geofenced areas -- and configure both processors to write the enrichment / tagged values to new fields -- there is a risk that the creation of new GeoEvent Definitions will cause an exception which in turn causes a loss of connection between GeoEvent Server and the GeoEvent Gateway. The only recovery if this happens is to stop and restart the GeoEvent Gateway service. This issue has not yet been addressed. Suggestions to work around the problem: (1) Apply the latest 10.8.1 patch or upgrade to the 10.9.1 release. (2) Add a field to a GeoEvent Definition you own/author and configure the GeoTagger processor(s) to write their values to that existing field. You will need to use a Field Mapper in front of the GeoTagger to prepare the event record schema so that records routed to the GeoTagger have the empty/null field the GeoTagger processor can write to. (3) Do not configure two GeoTagger processors to work in parallel. Configure separate GeoEvent Services, each with a single GeoTagger processor, and allow that single processor to create a new GeoEvent Definition as needed when writing to new attribute fields. The first is the preferred option. If you continue to have problems, please open an incident with Esri Technical Support and refer to the following two bug reports: BUG-000142170 and BUG-000132928. @DanWade @RichardsonAluvilayil
12-10-2021 03:18 PM

IDEA
A new document has been added to the GeoEvent Server Documents board which illustrates how to add new attribute fields to a BDS Data Source’s map or feature service schema. This can be done using either the Enterprise portal content item manager or ArcGIS Pro to update the schema of an existing hosted feature layer. Please note that while new attribute fields can be added to an existing data source you are not currently able to remove or delete attribute fields from an existing hosted feature layer when data is stored within a spatiotemporal big data store. Click to jump to the article on the GeoEvent Server Documents board. @EricIronside @yujingwu1
12-10-2021 02:39 PM

DOC
A new write-up is available illustrating the steps to add new attribute fields to a BDS Data Source’s map or feature service schema. This can be done using either the Enterprise portal content item manager or ArcGIS Pro to update the schema of an existing hosted feature layer. Please note that while new attribute fields can be added to an existing data source you are not currently able to remove or delete attribute fields from an existing hosted feature layer when data is stored within a spatiotemporal big data store. Please download the PDF attached to this article. @EricIronside @yujingwu1
12-10-2021 02:32 PM

POST
Hello Walter -- Up front, I'm going to ask that you open an incident with Esri Tech Support to get a specialist assigned who can help reproduce the issue on our end. This is for traceability in case the behavior you are seeing is a bug which needs to be addressed by the development team (as opposed to something we can configure in your deployment or environment that solves the issue for you). A couple of things I know about stream services: There is only one web socket running for each stream service output. This web socket runs as part of the GeoEvent Server's JVM. When multiple clients subscribe concurrently to the web socket they all suffer a deterioration in performance. I do not know what the upper limit is on how many clients can concurrently subscribe to a single stream service -- only that there is little you can do to scale out the stream service for many hundreds or thousands of subscribers. You cannot, for example, configure more than one web socket resource to run and support a stream service you expect to experience a high volume of subscribers. There are four component loggers for stream services which you might try setting to log DEBUG messages to see if you can capture an error or warning around the time the stream service stops broadcasting data:
com.esri.ges.datastore.agsconnection.NewStreamServiceBuilder
com.esri.ges.framework.streamservices.client.StreamServiceClientImpl
com.esri.ges.framework.streamservices.client.AGSConnectionStatusListenerStreamServiceClient
com.esri.ges.transport.streamService.StreamServiceOutboundTransport
I do not know offhand how verbose these component loggers are relative to data ingest volume / velocity. If you request DEBUG logging, monitor your karaf.log found beneath ...\ArcGIS\Server\GeoEvent\data\log to gauge how fast the log file is growing and how often it rolls over.
If the logging is too verbose you could easily miss the data you are trying to capture as the current karaf.log fills up, rolls over, and older rolled log files are deleted. As one stream service appears to be working reliably and the other seems to be failing every day or so, is there anything obviously different between the two? Are both stream services broadcasting roughly the same number of Point (my assumption?) feature records per second? Do both stream services have roughly the same number of concurrent subscribers? You indicate that the problem does not manifest when the same stream service is run on a second machine -- but then, there isn't anyone subscribing to that instance of the stream service. I know that Esri Tech Support has a similar case / incident they are helping the development team reproduce -- not specifically stream service related, but related to event processing stopping after a variable length of time (reportedly 2 - 5 days). To briefly describe some underlying GeoEvent Server architecture for you: there are two Kafka topic queues working with each GeoEvent Service. The first routes event records from an input (the producer) to a GeoEvent Service for processing. The second routes processed event records from the GeoEvent Service to an output (a consumer). If something were to go wrong with the Kafka event broker I might expect an input would continue to ingest and adapt data to create event records, but those event records would not be processed by a GeoEvent Service -- you would see the event count of the input increase as data was received, but the 'In' / 'Out' event count of the GeoEvent Service in which that input was incorporated would not increment. Or, if the second Kafka topic had failed, you might see event data being processed and the 'Out' event count for a GeoEvent Service incrementing, but the stream service output's event count would not increment.
Knowing which of these you are observing (if either applies) will help Esri Technical Support better address an incident you submit with them. To help Esri Technical Support we will need you to capture a sample of the radio device or iPad device data being sent to GeoEvent Server. It doesn't look like there is anything "stateful" in your GeoEvent Service (e.g. I don't see an Incident Detector or any particular filtering logic) which might prevent us from running the same few hundred event records in a loop through GeoEvent Server for 48 hours. If you have reason to believe that we need more temporally consistent / contiguous data, we'll need your help to get a snapshot of that so that we can run the system for 48 hours to see if we can recreate the failure in-house. Please also create an XML snapshot of your configuration using the 'selective' export option to export just the GeoEvent Service whose stream service output is having a problem. This will simplify the XML snapshot by not including inputs, outputs, GeoEvent Definitions, or other configurable elements not shown in the relatively simple GeoEvent Service you illustrated above. Hope this information helps -- RJ
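The two-topic routing described above can be sketched in a few lines. This is a toy model (plain in-memory queues, not Kafka, and not the actual GeoEvent Server implementation) meant only to show which counters stop incrementing when each stage fails; all names here are illustrative.

```python
from queue import Queue

# Topic 1: input (producer) -> GeoEvent Service for processing.
inbound_topic = Queue()
# Topic 2: GeoEvent Service -> output (consumer, e.g. a stream service).
outbound_topic = Queue()

counts = {"input": 0, "service_in": 0, "service_out": 0, "output": 0}

def ingest(record):
    """The input adapts received data and publishes to the first topic."""
    counts["input"] += 1
    inbound_topic.put(record)

def process():
    """The GeoEvent Service consumes topic 1 and publishes to topic 2."""
    while not inbound_topic.empty():
        record = inbound_topic.get()
        counts["service_in"] += 1
        counts["service_out"] += 1
        outbound_topic.put(record)

def broadcast():
    """The output consumes topic 2 and broadcasts over its web socket."""
    while not outbound_topic.empty():
        outbound_topic.get()
        counts["output"] += 1

for r in ({"track": "A"}, {"track": "B"}):
    ingest(r)
process()
broadcast()
print(counts)
```

When both topics are healthy all four counters increment together. A failure of the first topic would leave only "input" climbing; a failure of the second would leave the output's counter flat while the service's 'Out' count climbs -- which is exactly the distinction that helps Tech Support localize the fault.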
11-16-2021 07:17 PM

BLOG
Hey Adam -- Thank you for highlighting these new server-side (datesInUnknownTimeZone) and client-side (timeReferenceUnknownClient) properties. Checking with the development team, I was told that service publishers, beginning with the 10.9 release, can now set datesInUnknownTimeZone to TRUE when publishing a feature service (the default for this property is FALSE). If an application sends a query, addFeatures, or updateFeatures request and does not explicitly set the timeReferenceUnknownClient query parameter to TRUE in the request -- and the service was published with its datesInUnknownTimeZone parameter set TRUE -- the client will receive an HTTP error code response. This is to protect client applications querying newer 10.9(+) services which make use of the new specification from receiving unexpected data. Services you publish using GeoEvent Manager at 10.9 / 10.9.1 will continue to use the default setting of FALSE for the datesInUnknownTimeZone parameter. ArcGIS Server feature services and Enterprise portal hosted feature layers will not change the default for this property when services are upgraded from 10.8.x to the 10.9.x release. Therefore, applications sending client queries should expect to receive the same epoch long integer values from feature services and hosted feature layers they have been receiving regardless of whether the services are (pre-upgrade) 10.8.x services or have been upgraded to the 10.9.x release. I do not have a 10.9.1 release candidate deployment I can use to check right now, so I would ask that you "trust but verify" what I've said above. You can examine an ArcGIS Server feature service's (or an Enterprise portal hosted feature layer service's) JSON specification by navigating to the service in the ArcGIS REST Services Directory and clicking the JSON link in the upper-left corner of the service's web page. 
I would look to see if the upgraded service's JSON specification has the datesInUnknownTimeZone parameter set TRUE or kept the FALSE default. I would then check that web applications (e.g. web maps) you have configured display the expected date/time when looking at a feature record's attributes in a pop-up or table. I would also configure a GeoEvent Server input to query the ArcGIS Server feature service and/or Enterprise portal hosted feature layer you have upgraded to retrieve feature records and write those out to a JSON file. Use the https://www.epochconverter.com online utility to convert the epoch long integer values GeoEvent Server has written out to its JSON data file to confirm the queried values. If you have any trouble with this, please open an incident with Esri Technical Support. (I will be running these same tests myself as I was previously unaware of the new service property and corresponding client query parameter.) -- RJ
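As a quick sketch of the client-side opt-in, here is how a query request carrying the timeReferenceUnknownClient parameter might be assembled. The service URL below is hypothetical (substitute your own endpoint); the parameter names come from the discussion above.

```python
from urllib.parse import urlencode

# Hypothetical feature service query endpoint -- substitute your own.
base = ("https://myserver.domain.com/arcgis/rest/services/"
        "Sensors/FeatureServer/0/query")

params = {
    "where": "1=1",
    "outFields": "*",
    # Required opt-in when the 10.9+ service was published with
    # datesInUnknownTimeZone=TRUE; omitting it yields an HTTP error.
    "timeReferenceUnknownClient": "true",
    "f": "json",
}
request_url = f"{base}?{urlencode(params)}"
print(request_url)
```

The date attributes in the response remain epoch long integer values either way; the parameter only acknowledges that the client understands the service's dates carry no known time zone.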
10-28-2021 05:00 PM

POST
Hey Dean -- What you describe -- copying a GeoEvent Definition which has been generated for you in order to apply tags or even to adjust data types -- is the better practice. I often have to adjust a GeoEvent Definition to specify that a numeric value be handled as a Long, or even as a Date when receiving epoch values, rather than allowing the more generic Double data type to be used. Considering the application of a TRACK_ID tag: event definitions generated for you by an inbound connector or processor cannot accurately guess which field might hold a unique record identifier, so the TRACK_ID tag is not applied when generating a new GeoEvent Definition. I've also seen what you mention with duplicate GeoEvent Definitions being created, especially when an XML snapshot of a configuration is loaded and what I refer to as "managed GeoEvent Definitions" are part of the snapshot. Frequently an input configured to create a GeoEvent Definition will create a new GeoEvent Definition when receiving data for the first time following an administrative reset and configuration import. The "duplicate" event definition created has the same schema and even the same name as the event definition captured by the snapshot, but the underlying GUID for the two event definitions is different, which can lead to other problems. The less you allow components you configure to create GeoEvent Definitions for you on-the-fly, the more distinct and specific your configuration becomes -- which tends to improve resiliency and overall solution reliability. Auto-generated GeoEvent Definitions can be useful, but should always be replaced with copies you have tailored specifically for your configuration and operations. Auto-generated GeoEvent Definitions are really just best guesses for an event record's data structure; the guesses are generally good, but rarely a 100% match for what you want. -- RJ
10-20-2021 11:20 AM

POST
@EricSpangenberg -- Apologies that your enquiry went so long without a response. If you have not already, please open a support incident with Esri Technical Support for this issue. It is not feasible, and in fact can be quite dangerous from a security perspective, to attempt an in-place upgrade of a third-party component GeoEvent Server incorporates as part of its Karaf application container. Please indicate to Esri Technical Support the release of ArcGIS Enterprise and GeoEvent Server you are working with. If this is a vulnerability not covered by the ArcGIS GeoEvent Server Security Update 2021 Patch we need to evaluate this with our internal security team. If you are working with a release prior to 10.6.1 we probably need to discuss with you the possibility of upgrading to an ArcGIS Enterprise release for which an appropriate security patch is available. Either way, we need a support incident opened with Esri Technical Support for traceability. Thanks -- RJ
10-14-2021 11:46 AM

POST
Hey Gary -- I checked, and the Processing Coordinate System WKID parameter expects a literal integer value. This particular processor is not set up to handle dynamic substitution, so we cannot extract a WKID value from another attribute field to make the processor more dynamic with regard to spatial reference. I've added your use case as an enhancement for this processor, but cannot commit to a timeline for when the enhancement will be picked up or to any particular release. For now, if data you receive spans a wide geographic area and you need to create a buffer or calculate a range fan, you will have to select and specify an integer WKID for a projection generally applicable across a wide area of operations. Since a projection like Web Mercator is "generally inaccurate everywhere", perhaps you can identify a "zone" in which the data resides and choose a projection more or less in the middle of that zone. That way, at least, you would not be working with 250+ highly accurate projections for specific areas within a U.S. State (since you mentioned the State Plane Coordinate System). If you select one projection for the "Pacific NorthWest" and another for the "Atlantic SouthEast" perhaps your range fans will be accurate enough -- at least more so than using a world-wide Web Mercator projection. -- RJ
10-14-2021 10:57 AM

POST
Hey Suzy -- Because JSON is self-describing and provides names for all the key/value pairs in a data structure, an input you configure is able to ignore "missing" data and handle the case when "extra" data arrives which is not represented in a GeoEvent Definition. Stated another way, if data received does not contain a particular key/value represented in the GeoEvent Definition being used to adapt that data, the attribute value in the adapted event record will be assigned a null value. If a received data record contains a key/value which is not represented in the GeoEvent Definition then the extra data will be ignored. You do not need to allow an input or inbound connector to create a GeoEvent Definition for you. In fact, after sending a bit of sample data to an input, so its adapter can create a GeoEvent Definition for you, the recommended best practice is to copy the auto-generated definition and make whatever edits are needed. You might want, for example, to handle a particular data value as a Date rather than a Double, or handle a numeric value as a Long even though it is sent to you as a quoted String value. You then delete the auto-generated GeoEvent Definition and reconfigure your input to use the GeoEvent Definition you have reviewed and edited. As for integrating with an external API, I would assume you would first identify the REST web service endpoints you want to poll for data and obtain a sample of the data to help you administratively create a GeoEvent Definition which reflects the expected data structure. You can then configure an input to use that GeoEvent Definition as it adapts data records it receives in response to data requests it sends. (If you are asking for an automated way of creating data structures and updating them on-the-fly as new data feeds are discovered, I'm sorry, but I don't have a good solution for that.) Normally a data provider will provide a specification from which you can create a GeoEvent Definition.
Sometimes the data provider will document an actual sample of their data -- and that is when you can allow your input to (temporarily) create a GeoEvent Definition for you based on the first data record it receives. But you will always want to copy, edit, and then delete the auto-generated event definition as a best practice. You can always delete an auto-generated GeoEvent Definition and allow the input to poll again to see if/how a feed's data has changed so that you can make adjustments to an authoritative GeoEvent Definition you are developing for a particular data feed. There is a "Learning Mode" that you can toggle 'on' to briefly give the inbound connector permission to adjust a GeoEvent Definition it has created based on variable data structures it receives -- but the adapter is only going to add newly observed key/value pairs as named attributes to an existing GeoEvent Definition to try and accommodate data it is just now seeing that it didn't see before. Even with "Learning Mode" toggled on the adapter is not going to recursively iterate through a received data structure. In the case we were discussing, if at one time the adapter saw an empty array and decided to handle/adapt data for that attribute as an array of String values, it will not change that attribute specification based on data seen later (which might be a collection of JSON elements with a specific sub-structure) to update the GeoEvent Definition and change String:Many to be Group:Many. The adapter will simply fail to adapt the array of JSON elements as primitive String values. You will have to administratively adjust the GeoEvent Definition the adapter is using to accommodate the type of data elements the data feed is actually placing into the now populated array. -- RJ
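The missing/extra field reconciliation described at the top of this reply can be sketched in a few lines of Python. This is a simplified model of the behavior, not the adapter's actual code, and the field names are hypothetical.

```python
# A GeoEvent Definition is, at its simplest, a named set of typed fields.
geoevent_definition = {"track_id": str, "speed": float, "reported": int}

def adapt(received: dict) -> dict:
    """Build an event record containing exactly the fields named in the
    definition: missing keys become null (None), extra keys are dropped."""
    return {field: received.get(field) for field in geoevent_definition}

# 'speed' is missing from the data -> null; 'battery' is extra -> ignored.
record = adapt({"track_id": "unit-42", "reported": 1639681800000, "battery": 87})
print(record)  # {'track_id': 'unit-42', 'speed': None, 'reported': 1639681800000}
```

The point is that the definition, not the incoming data, dictates the adapted record's schema -- which is why maintaining an authoritative, hand-edited definition matters.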
10-14-2021 10:32 AM

POST
Hello Dean -- My apologies that your inquiry has not received any responses. Have you opened a ticket on this with Esri Technical Support? I would like to have these sorts of issues tracked. While I have not had a problem configuring Incident Detector in deployments I have set up for regression testing, I did have someone reach out to me who was having a problem with their deployment and configuration which I was unable to reproduce. If this is still an issue for you, and you are so inclined, please work with Esri Technical Support to try and establish a reproducible case.
10-13-2021 06:33 PM

POST
William -- The product development approach for GeoEvent Server avoids developing connectors for specific devices or platforms. While there are some samples on the GeoEvent Server Gallery for data provider or device specific inbound connectors (e.g. Verizon Connect Connector for GeoEvent Server, or Geotab for GeoEvent Server), when considering connectors available out-of-the-box you generally want to identify what transport protocol will be used to send data to GeoEvent Server (e.g. HTTP, TCP, UDP, WebSocket) and how the data records will be formatted (e.g. generic JSON, generic XML, delimited text). The answers to these two questions will guide you toward a configurable out-of-the-box inbound connector. Since you mention the ActiveMQ connector, I'll offer that over time we have exposed (as SDK samples) connectors based on "message queues" and "message brokers" such as ActiveMQ, RabbitMQ, MQTT, and Kafka. Not all of the samples on the Gallery are plug-and-play with the latest release of GeoEvent Server. Connectors based on different message brokers follow GeoEvent Server's own evolution: we started with ActiveMQ, migrated to RabbitMQ, and are now using Kafka internally to queue and manage event records. None of this changes the fundamentals of GeoEvent Server integration: identifying a data transport protocol and a data record format. I am not aware of anything developed specifically for WebLogic Thin T3 Client. This could be something you develop a specific connector for using the GeoEvent Server Java SDK. Implementation support for this type of project is available through Esri Professional Services. Hope this information helps -- RJ
10-13-2021 06:25 PM

POST
Hello William -- The inbound connectors you configure assume that they will receive the latest available data from a sensor network in real-time or near-real-time. Fundamental assumptions are that data will arrive in temporal order, at some discrete frequency and periodicity. Data should not be sent in batches with data records potentially out-of-temporal-order. GeoEvent Server can receive batches of data, but the batch is assumed to be a collection of individual observations from discrete sensors, not a collection of observations from a single sensor. You might want to look into using SDK samples available on the GeoEvent Server Gallery to supplement your solution. The Delay Processor for GeoEvent Server or the Timetree Processor for GeoEvent Server may allow you to receive a collection of data observations, hold the data for a specified amount of time (e.g. "delay processing"), and sort the data by TRACK_ID into a proper temporal order to guarantee processed event records reflect a first-in / first-out view of data collected from sensors in time order. If you need help working with these SDK samples, please open an incident with Esri Technical Support. Limited consulting is available through technical support. More in-depth help implementing a solution can be arranged through Esri Professional Services if needed. Hope this information helps -- RJ
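The buffer-then-sort idea behind that delay processing can be sketched as follows. This is assumed behavior, not the actual Delay Processor or Timetree Processor source, and the track IDs, timestamps, and field names are made up for illustration.

```python
from collections import defaultdict

# A batch of observations arriving out of temporal order for "bus-1".
batch = [
    {"track_id": "bus-1", "time": 1639681860000, "speed": 31.0},
    {"track_id": "bus-1", "time": 1639681800000, "speed": 28.5},
    {"track_id": "bus-2", "time": 1639681830000, "speed": 44.2},
]

# Step 1: hold the batch and group buffered observations by TRACK_ID.
by_track = defaultdict(list)
for record in batch:
    by_track[record["track_id"]].append(record)

# Step 2: release each track's observations sorted by timestamp, so the
# downstream processing sees a first-in / first-out temporal stream.
ordered = []
for track_id in sorted(by_track):
    ordered.extend(sorted(by_track[track_id], key=lambda r: r["time"]))

for r in ordered:
    print(r["track_id"], r["time"])
```

Within each track the earlier observation is now guaranteed to be processed first, which is the property a stateful downstream processor needs.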
10-13-2021 05:08 PM

POST
Hello Suzy -- The hierarchy in the JSON data structure you illustrated looks almost recursive. If this is a typical example of the JSON an input you have configured would expect to receive as its first data record, I think the inbound adapter is making its best guess at what the GeoEvent Definition ought to be. An inbound adapter's guess will be more accurate for simpler data structures -- the adapter will not recursively iterate through a data structure to further refine what it sees up-front. For example, in my illustration above I've added just a little white-space and formatting to your JSON example. The line I've designated "Ex. 01" in green is intended to be an array capable of holding zero or more data values. But the adapter, given the empty array in the illustration, does not know whether the array's data will (eventually) be a set of integer values, a set of string values, or a set of JSON elements with a more detailed sub-structure. Given the empty array, the inbound adapter makes a guess and assumes a data type of String and a cardinality for the data value of Many (indicated with the infinity symbol circled in green in the illustrated GeoEvent Definition). As long as the data values eventually received in that array can be implicitly cast to String the adapter will be able to parse and adapt data it (eventually) receives. Looking at another part of the data structure, designated "Ex. 02" and highlighted in orange, we see an "elements" array which contains JSON elements (as opposed to primitive String or Integer values). But the two JSON elements shown in the example have potentially different sub-structures. The first "attributes" array (boxed off in orange) is empty. As before the inbound adapter makes a guess and assumes an eventual data type of String setting the cardinality to Many (result circled in orange in the illustrated GeoEvent Definition). 
Again, the inbound adapter is not going to recurse more deeply into the data to see the other "attributes" (boxed off in blue) -- so it will not see that its first assumption is wrong and that "attributes" is probably not going to be an array of String values. The inbound adapter will not know that a key "attributes" found within "elements" is actually intended to hold zero or more JSON elements with their own sub-structure. A quick test, removing the line I've designated "Ex. 02" above, confirms that a JSON inbound adapter receiving the JSON illustrated below will assume a data type Group for an "attributes" key found nested beneath a key "elements": I suspect the GeoEvent Definition in this second illustration is closer to what you were expecting. Hope this information helps -- RJ
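The guessing behavior described above can be captured in a small sketch. This is a simplified model of the adapter's non-recursive, first-record-only type inference, not its actual source code.

```python
def guess_field_type(value):
    """Mimic a one-pass guess at a GeoEvent Definition field from a single
    sample value: arrays get cardinality Many, and an EMPTY array defaults
    to String because there is nothing to inspect."""
    if isinstance(value, list):
        if not value:
            return ("String", "Many")       # empty array: best-guess default
        if isinstance(value[0], dict):
            return ("Group", "Many")        # array of JSON elements
        return (type(value[0]).__name__, "Many")
    return (type(value).__name__, "One")

print(guess_field_type([]))                           # ('String', 'Many')
print(guess_field_type([{"name": "a", "value": 1}]))  # ('Group', 'Many')
```

This is why the first data record matters so much: an empty array in the sample locks in String:Many, and later records carrying JSON elements in that array will then fail to adapt until the GeoEvent Definition is corrected by hand to Group:Many.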
10-13-2021 04:44 PM