Hey Russell -- The pattern you propose, [0-9,a-z,A-Z], will work for a mix of letters and numbers, though note that the commas inside the brackets are treated as literal characters, so [0-9a-zA-Z] is the cleaner equivalent. The character class matches any single lower-case letter, upper-case letter, or digit. The repetition qualifier {1,} specifies one or more repetitions matching this pattern. You could try using the \w metacharacter, which matches any alphanumeric character (plus the underscore); your pattern is essentially the same thing.

I use the online utility https://regex101.com to develop and test my regular expression patterns. Another good site offering a RegEx tutorial is https://regexone.com There are different flavors of RegEx, so to be safe I would select 'Java 8' in the regex101.com web tool's left-hand options frame. That site is nice in that it explains why the pattern is matching the way it does. All you need to recognize then is that the function expression you configure GeoEvent Server's Field Calculator with has three parameters: the data field, the pattern to match, and the replacement for every occurrence of that pattern (in this case a single literal character 'x').
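A quick sketch of what that replaceAll( ) pattern does, shown in Python for convenience (GeoEvent's Field Calculator uses Java's regex flavor, but this pattern behaves the same in both; the sample value is hypothetical):

```python
import re

# Hypothetical field value; [0-9a-zA-Z] matches any single letter or digit.
# Note: in [0-9,a-z,A-Z] the commas are literal characters, so the
# comma-free form below is the cleaner equivalent.
value = "Sensor-42/Zone-B7"
masked = re.sub(r"[0-9a-zA-Z]", "x", value)
print(masked)  # every letter and digit replaced with a literal 'x'
```

The punctuation survives because it falls outside the character class, which is exactly the behavior you want when masking alphanumerics.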
07-28-2021 10:13 AM | POST
Hello @kavi88 ... I would say that GeoEvent Server is able to handle null value input. Attribute values can be null, and there should not be a runtime exception generated that creates a fault in event record processing. That doesn't mean that you'll be able to calculate a derivative value if the input values are null or if attribute values cannot be used in the expression you configure a Field Calculator to use.

Suppose you receive a simple event record like: { "myDouble": 3.14159, "scaleFactor": 3.1, "calcResult": null } A Field Calculator configured with an expression myDouble * scaleFactor will be able to write the value 9.738929 into an existing field calcResult. But if one or more of the attribute fields contain null values: { "myDouble": 3.14159, "scaleFactor": null, "calcResult": null } you should expect to see some sort of error. You cannot multiply a Double and a null value, or implicitly cast a null or a literal string to a numeric value to allow a Field Calculator to compute a value. We try not to make up data in cases where invalid values are received. We wouldn't want, for example, to assume a location of 0.0 latitude / 0.0 longitude because lat and lon values pulled out of a data structure were null.

Suppose, rather than computing a Double value, we were simply trying to place two Double values into a descriptive string. An expression like the following: 'My Double is: ' + myDouble + ' and my Scale Factor is: ' + scaleFactor + '.' written into a String attribute would calculate a value something like: "My Double is: 3.14159 and my Scale Factor is: 3.1." If a null value were received for scaleFactor, an error message like the following is logged: Expression ['My Double is: ' + myDouble + ' and my Scale Factor is: ' + scaleFactor + '.'] evaluation failed: EVALUABLE_EVALUATION_FAILED_CAUSE The error message above is what is produced at the 10.9.x release.
It may be that Field Calculator is logging less readable error messages at an earlier release, which would explain why you are seeing messages talking about arg0:[NonGroup], arg1:[NonGroup]. I know we improved the error messages that Field Calculator was logging at some point, but I don't remember which s/w release has those changes. Regardless, if an expression uses attribute field(s) whose value(s) are null ... you should probably expect some sort of error to be logged and the computed result to receive a null value.

The problem you are trying to solve has several different places where something can go wrong. I have frequently encountered, for example, data in a rich, complex hierarchical structure not being 100% homogeneous across all of the levels in the hierarchy. It could easily be the case, for example, that the "impacted_objects" for a "disruption" do not have a "stop point" defined. It may be that there is no value at a hierarchical path disruptions[idx].impacted_objects[idx].impacted_stops[idx].stop_point.coord.lat or, if an attribute exists at that level in the data structure, its value is null. I would assume that after you use the serialized multicardinal field splitter processors to flatten out all of the levels in the data structure, you'll have to use a couple of filters to test whether valid lat and lon values can be retrieved, and log a "disruption" identifier to a file when a "stop_point" location cannot be calculated rather than trying to calculate a string representation of a geometry using null values. - RJ
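The null-propagation behavior described above can be sketched in a few lines of Python (GeoEvent Server itself is Java; this is only an illustration of the principle that a null operand yields no result rather than an invented value):

```python
# Hypothetical event record mirroring the example above.
event = {"myDouble": 3.14159, "scaleFactor": None, "calcResult": None}

def calc(rec):
    # Mimics 'myDouble * scaleFactor': a null operand means no result,
    # not a made-up value like 0.
    try:
        return rec["myDouble"] * rec["scaleFactor"]
    except TypeError:
        return None  # leave calcResult null and (in GeoEvent) log an error

event["calcResult"] = calc(event)
print(event["calcResult"])  # None -- the calculation failed, nothing invented
```

With both operands populated the same function returns 9.738929, matching the successful case in the post.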
07-26-2021 12:09 PM | POST
Hey @kavi88 -- When using a Field Calculator to construct a "Geometry" you are actually calculating a String representation of a Geometry. When I need to confirm the string calculation I will often configure a Field Calculator to write its string representation to a String attribute field and then map the String to a Geometry attribute field. You can configure a Field Calculator to write its string representation directly into a Geometry attribute field, but the single step means that you are asking for an implicit type cast from String -- the value calculated as a single-quoted literal -- to a Geometry. If the string value does not exactly match the required formatting for a Point geometry object, the Field Calculator's attempt to write its string into a Geometry field will fail. So, to Eric's point, you might want to route event records emitted from the GEOM_CONSTRUCTION Field Calculator you configured to a JSON File so that you can get a good look at the String the processor constructed for you, to make sure it matches the formatting of a Point geometry object.

You can probably drop the two Field Calculator processors LatConverter and LonConverter from the event processing workflow. You can configure the MAPPING FIELDS Field Mapper to map your latitude and longitude attribute values from String to Double by simply mapping the attribute values into Double fields. This is just another implicit cast, like using a Field Calculator to compute a string representation of a geometry and writing the computed string into a Geometry field. If I had to guess, the problem you're having is probably in the serialized event schema flattening. Placing five Multicardinal Field Splitter processors in series is more than I've ever had to do to simplify a hierarchical data structure. It's either that, or the string representation of the Point geometry object being calculated doesn't match the ArcGIS REST API specification of a Point geometry.
As a debugging step, you might try using dot notation to pull a single pair of latitude and longitude values out of the hierarchical data structure, using a Field Mapper to map the entirety of the data structure down to an event record whose GeoEvent Definition has exactly two Double attributes (one named lat and one named lon). Then work with that very simple event record to debug the field calculation you need to perform to construct a JSON representation of a Point geometry object. disruptions[0].impacted_objects[0].impacted_stops[0].stop_point.coord.lat => lat disruptions[0].impacted_objects[0].impacted_stops[0].stop_point.coord.lon => lon I wrote the above without actual data to look at and test, so I am not 100% sure I have the notation correct. If you need help with this I would ask that you open a technical support incident with Esri Support. What I'm trying to do above is take the zero-th value from each group element whose cardinality is 'Many' (indicating the JSON element is a zero-based indexed list of values) to pull a single "stop point" coordinate's latitude and longitude out so that the values can be used in a Field Calculator. You'll still need to use the Multicardinal Field Splitters eventually so that you can run calculations on all of the stop points, but the above can help you debug to make sure the string calculation of the Point geometry object is being done correctly. Hope this helps -- RJ cross-reference: JSON Data Structures - Working with Hierarchy and Multicardinality
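To make the target format concrete, here is a small Python sketch of the string a Field Calculator expression would need to build (the lat/lon values are hypothetical; the shape of the JSON follows the ArcGIS REST API point geometry object):

```python
import json

# Hypothetical lat/lon pulled from the flattened record. The target format
# is the ArcGIS REST API point geometry:
#   {"x": <lon>, "y": <lat>, "spatialReference": {"wkid": 4326}}
lat, lon = 48.8566, 2.3522

point_str = (
    '{"x":' + str(lon) + ',"y":' + str(lat)
    + ',"spatialReference":{"wkid":4326}}'
)

# Sanity check: the string must parse as JSON before asking GeoEvent Server
# to cast it to a Geometry. Note x is the longitude and y is the latitude.
geom = json.loads(point_str)
print(geom["x"], geom["y"])
```

Routing the intermediate String to a JSON file output, as suggested above, is the GeoEvent equivalent of this parse-it-back sanity check.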
07-23-2021 10:51 AM | POST
Philip -- Your solution using an outbound connector which is essentially a No-Operation component is a bit orthogonal to GeoEvent Server's design. What you're doing is one reason that we don't offer out-of-the-box processors with capabilities to invoke a GP Service, for example. We certainly could, as a GP service is as RESTful as the other web services GeoEvent Server interfaces with. But the question becomes: do we want to block a processor node's flow as it waits on a response from a GP Service? This wouldn't be feasible when trying to process hundreds of event messages per second. Or do we allow the processor to invoke an asynchronous GP service task/job and have the processor send the logical equivalent of "nothing" or "process pending" along to an outbound connector? That's not consistent with GeoEvent Server's design. GeoEvent Server is fundamentally accepting data, adapting the data to produce individual event records, then processing each of those event records atomically (without retaining or caching data from an event record unless absolutely necessary), so that data from a processed event record can be routed along to an outbound connector for dissemination.

Your solution appears to be developer-centric and highly customized. If I understand what you're saying, you have a custom inbound adapter, a custom processor, and now a custom outbound connector. If the GeoEvent Server's Java SDK allows you to develop a solution using GeoEvent Server as a platform for event record processing -- that's great -- but I'm not sure that the product team can be of much help moving forward. I will offer that the multicardinality and hierarchy supported by a GeoEvent Definition is not specific to JSON. How data is ingested and adapted is not tied to a specific data format (e.g. JSON object format). Every event record has a GeoEvent Definition which describes the event record's data structure. This event definition applies only to the interior of an event record object.
There is no mechanism which allows you to define a group, list, or hierarchy of multiple event records. A GeoEvent Definition can specify a data structure which includes a list of Java primitive values (e.g. Date, Double, Long, String, ...) and/or incorporate a non-primitive type Group which includes multiple primitive values as a sub-structure within the overall data structure. But this all still describes the data structure of a single event record object. The hierarchy and multicardinality concepts discussed in the article you found do not apply to collections of multiple event records.
04-21-2021 10:38 AM | POST
Hey Philip -- GeoEvent Server's processing of event data was designed to be atomic. Every event record is processed individually. Generally speaking, a processor does not know anything about event records recently processed or event records in the pipeline about to be processed. It only knows what data is in the event record it has received that needs to be processed. There are exceptions, of course. A filter or processor needing to evaluate an Enter condition, for example, needs to know if the previous event for a given tracked asset (identified using the TRACK_ID tag) was "outside" or "disjoint" so that it can determine that the event record it just received, which is "inside" or "intersects", has entered the area of interest. You might look at the Timetree processor, a custom processor whose source code is available in a GitHub repository, as an example of a custom processor designed to collect and cache a number of event records in order to perform some processing on a collection of received data records. But as you say, you'll have to design some sort of parameterization so the processor knows when to stop collecting data and start processing the collection. I don't think it's possible to configure a GeoEvent Definition such that the data structure represents an amalgamation of multiple event records. Since every event record must have an associated GeoEvent Definition specifying the event record's data structure - I don't think you'll be able to do what you're asking. But I'll check with a colleague and reply back if it turns out this is possible and something reasonably accomplished. -- RJ
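The collect-then-process pattern described above can be sketched in a few lines; note this is an illustration in Python, not the GeoEvent Java SDK, and the names (BatchCollector, on_batch) are invented for the example:

```python
# Minimal sketch of a processor that caches event records and processes
# them as a collection. The batch size is the "parameterization" the post
# mentions -- the rule telling the processor when to stop collecting.
class BatchCollector:
    def __init__(self, batch_size, on_batch):
        self.batch_size = batch_size
        self.on_batch = on_batch   # callback run on each full collection
        self._cache = []

    def process(self, event_record):
        self._cache.append(event_record)
        if len(self._cache) >= self.batch_size:
            batch, self._cache = self._cache, []  # drain the cache
            self.on_batch(batch)

results = []
collector = BatchCollector(3, results.append)
for rec in ({"id": i} for i in range(7)):
    collector.process(rec)
print(len(results))  # 2 full batches emitted; one partial record still cached
```

A real implementation would also need a time-based or end-of-stream flush so the final partial batch isn't stranded in the cache.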
04-20-2021 06:35 PM | POST
Hey Adam -- The Field Mapper processor was designed to flatten a schema to make the event record's data structure compatible with the ArcGIS REST Services API used when sending processed event data as JSON to a feature service's addFeatures or updateFeatures endpoint. You cannot use Field Mapper ... or any of the out-of-the-box processors ... to write data to a hierarchical structure. It appears, from your illustration, that event data being ingested is already adapted using a flattened data structure (e.g. the cardinality of every event record attribute is '1' and the data type is Date, Double, Long, String, (etc.) ... not Group). I think you'll want to consider developing a custom outbound adapter which is able to take a flat data structure and adapt it into the hierarchical data structure expected by the Web Hook you want to receive data you've processed through a GeoEvent Service.
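The core of such a custom outbound adapter is rebuilding hierarchy from flat attribute names. A minimal sketch, assuming a dotted-key naming convention (the convention itself is an assumption for illustration, not a GeoEvent Server feature):

```python
# Rebuild a nested structure from flat, dot-delimited attribute names --
# the inverse of what Field Mapper does when flattening a schema.
def unflatten(flat):
    nested = {}
    for dotted_key, value in flat.items():
        node = nested
        *parents, leaf = dotted_key.split(".")
        for part in parents:
            node = node.setdefault(part, {})  # create sub-objects on demand
        node[leaf] = value
    return nested

flat_record = {"stop.coord.lat": 48.85, "stop.coord.lon": 2.35, "name": "A"}
print(unflatten(flat_record))
```

In a real adapter this logic would run as the last step before serializing the event record into the JSON payload the Web Hook expects.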
04-20-2021 06:22 PM | POST
Hey Adam -- Serializing a JSON Object (e.g. the collection and structure of key:value pairs in-between the outermost curly braces) as a String can be done. It's not easy. You might take a look at the community thread How to switch positions on coordinates which illustrates a series of Field Calculator processors, each using a replaceAll( ) function with regular expression pattern matching, to perform some manipulation on a received String. The goal in that thread is to take the received data string and turn it into a JSON string representation of a polygon geometry. So it's not serializing JSON with all of its embedded double-quotes, square brackets, curly braces, and commas that's a problem.

I would recommend taking a step back to think about what you're trying to do. It could be very difficult to extract values from event record attributes for title, text, type, title (potential duplicate attribute name!), and value and insert them into a properly formatted JSON string so that you can send the String as a JSON Object to an external receiver using a Push JSON to an External Website outbound connector. That higher level challenge aside, the problem you're running into, I think, is that there's an embedded single quote in your data. If you remove that embedded single quote you can, as you suggested, wrap the whole serialized JSON string in a pair of single quotes and copy/paste it into a Field Calculator processor's expression. The bigger challenge is going to be designing a GeoEvent Service that accepts data, extracts values from that data, and computes derivative values to place into a hierarchical JSON structure of this complexity. -- RJ
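To see why the embedded single quote is the blocker, here is a small Python illustration (the payload is hypothetical); inside a single-quoted Field Calculator literal, that character would terminate the string early:

```python
import json

# Hypothetical attribute values; note the apostrophe in "It's offline".
payload = {"title": "Tower 7", "text": "It's offline", "type": "alert"}

# json.dumps handles the double-quotes, brackets, and commas for us.
serialized = json.dumps(payload)

# Stripping the single quote (or re-wording the value) is what makes the
# serialized string safe to paste between single quotes in an expression.
single_quote_safe = serialized.replace("'", "")
print(single_quote_safe)
```

The takeaway matches the post: the JSON punctuation itself serializes fine; it's the stray single quote that conflicts with the expression syntax.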
04-14-2021 06:27 PM | POST
Hello Dinesh -- The first part of your question is relatively easy. If you were to poll a feature service to obtain a set of feature records whose associated geometry were a polygon modeling a "project boundary", you could route each event record through a GeoTagger processor to enrich the event record with the (unique identifier) names of point geofences imported from a feature service providing the point locations of towers. What you have now is a comma delimited list of towers that fall within a project boundary.

The difficulty is three-fold. First, GeoEvent Server does not provide any sort of iterator to inspect individual items in a list. You don't know how many towers are expected to be in any given project boundary, so you cannot further enrich the "project boundary" event record with the "alert status" for each tower ... because you cannot iterate across the list of towers to query their alert status. You could use a Field Splitter processor from the GeoEvent Gallery to split a comma delimited list of towers in an enriched (geotagged) project boundary event record. This would produce separate event records, one for each tower in the project boundary. You could then enrich a second time to get the tower's alert status and add it to the project boundary event record. But each event record emitted from a Field Splitter is processed atomically (individually). This is the second challenge / limitation ... you cannot compare attributes from one event record with attributes in another event record. The third challenge, as I see it, is that there is no easy way to compare one tower's alert status to another and pick the greater of the two, especially when you don't know how many towers there are. I've never tried, for example, to design bitwise arithmetic into a GeoEvent Service to logically OR two bit sequences 0x0100 and 0x0010 to produce 0x0110 and then determine the highest-order bit set in the sequence.
The logical operations GeoEvent Server supports are much more general (e.g. determining if an event record's string is empty or null to set a Boolean result to 'true' or 'false' and then comparing that 'true' / 'false' value against another Boolean to determine what to do with the singular event record being processed).

You might approach the problem using a GeoTagger as described above to get the names of point geofences in an area of interest, splitting the event record using a Field Splitter to produce several independent event records, and then use a Field Enricher to look up the alert status for each event record's associated tower. You could then use an Update a Feature output to have GeoEvent Server make a REST request on a feature service to update the alert for an entire area (or project boundary) ... but you'll need some sort of database trigger to catch that request and only allow it to proceed if the alert value is equal-to or greater-than the feature record's current alert value. Otherwise, as I'm sure you realize, GeoEvent Server's serialized event processing stream will simply overwrite the project boundary feature record's alert status with the most recently processed tower's status. Hope this information helps you think through the analysis you want to perform. -- RJ
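For what it's worth, the bitwise idea mentioned above is simple enough outside GeoEvent Server; here is a Python sketch of OR-ing tower status masks and finding the highest-order set bit (the status encoding is hypothetical):

```python
# Two towers' alert levels encoded as bit masks, as in the post's example:
# 0b0100 | 0b0010 == 0b0110.
statuses = [0b0100, 0b0010]

combined = 0
for s in statuses:
    combined |= s  # accumulate all alert bits seen so far

# Index of the highest-order set bit == the "worst" alert level present.
highest = combined.bit_length() - 1
print(bin(combined), highest)
```

The catch, as the post says, is that GeoEvent's expression language has no loop to do this accumulation across an unknown number of towers, which is why the database-trigger approach ends up being the practical route.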
04-12-2021 04:49 PM | POST
The ArcGIS Data Store should be installed and configured on a machine other than the one used to run GeoEvent Server and the ArcGIS Server beneath which GeoEvent Server is run. This is especially true when configuring the spatiotemporal big data store. Please refer to the following resources: https://www.esri.com/content/dam/esrisites/en-us/media/technical-papers/architecting-the-arcgis-system.pdf (concept of workload separation) https://enterprise.arcgis.com/en/get-started/latest/windows/additional-server-deployment.htm#ESRI_SECTION1_F7B03953E7864058970E591E9D2CE859 (system architecture illustrations which show the base enterprise, GeoEvent Server, and spatiotemporal big data store all on separate machines)
04-01-2021 08:19 PM | POST
@Ctal_GISsquad - Please see my reply to your question in the thread Converting between Date Formats. Clicking here should take you directly to my reply in the thread.
02-26-2021 05:06 PM | POST
If data you are receiving contains only a date value (e.g. 12/31/2021) without a time, this is not a pattern GeoEvent Server recognizes without you specifying an Expected Date Format the inbound adapter can use to figure out how to parse a String as a Date. You would have to specify a value like MM/dd/yyyy when configuring your inbound connector. The connector will apply this pattern to all event record attributes whose data type is Date in the GeoEvent Definition used by the inbound connector.

When I send the String value "12/31/2021" to my GeoEvent Server with the Expected Date Format configuration described above, the Date value my inbound adapter constructs for me from the received string is 1640937600000. This is an epoch value used by Java to represent date/time values. GeoEvent Server uses millisecond epoch values, which is why the value has 13 digits rather than only 10. If I ask GeoEvent Server to cast its Date to a String I get a representation of the date which looks like "Fri Dec 31 00:00:00 PST 2021". Notice that the string has both a "date" and a "time" and includes the Time Zone for the expressed date/time value. In this case, the Date is expressed in the Pacific time zone. This is because an Expected Date Format pattern was specified -- which is required to handle an inbound string which does not match one of the few built-in expected patterns for a date/time value. The time zone handling is important to note because, in this case, the date/time is not in UTC units. GeoEvent Server assumes that the non-standard date/time must be a date/time local to my solution, so it uses the locale of my server (whose clock is configured to use the Pacific Time Zone).
Focusing on your question, if you are receiving a string value which is somehow being adapted to produce the epoch date value 1640908800000 (which could also be represented as "Thursday, December 30, 2021 4:00:00 PM GMT-08:00" or "Thu Dec 30 16:00:00 PST 2021") and you need to truncate the value to be simply "Thursday, December 30" ... you have a couple of options. I strongly recommend you make sure you understand how the received data is actually being adapted, and check to verify how client applications are representing the value in web map pop-ups or web forms. A client application will likely represent a Java Epoch date/time value it receives, when querying a feature service for feature records for example, in whatever time zone the client web application is running. The value 1640908800000 already represents the date/time "Friday, December 31, 2021 12:00:00 AM" when a UTC value is assumed and web clients are likely going to try and represent an assumed UTC date/time in whatever time zone the web application is running. If you were to add or subtract some number of milliseconds from the epoch to drop the "time" portion and keep only the whole "date" value, your effort is likely going to have unintended consequences client-side. You could use a RegEx pattern match on a value toString(myDate) to isolate the whole hours portion of the "time" and then multiply this by 3,600,000 (which is 60 min x 60 sec x 1000 ms) and then subtract that from your Date using a Field Calculator. The eventual expression would be something like: myDate - (16 * 3600 * 1000) This assumes you are able to extract the value "16" from a string "Thu Dec 30 16:00:00 PST 2021" to know that you wanted to subtract 16 hours worth of milliseconds from the myDate attribute value. You also might want to look at some of the supported expressions for the Field Calculator processor. 
The function currentOffsetUTC() specifically computes the millisecond difference between your GeoEvent Server's locale and UTC. Since my server is configured to use the Pacific Time Zone, which is currently -08:00 hours behind UTC, the currentOffsetUTC() function returns a value -28800000, which is (8 hours x 60 minutes x 60 seconds x 1000 milliseconds). You might scale the computed value by some constant when performing date/time adjustment arithmetic, or more likely, shift an epoch Date from an assumed local time zone so that the value represents a UTC value. The advantage of using currentOffsetUTC() is that the function automatically recognizes changes in daylight savings, so you don't have to rely on memory to update GeoEvent Services twice a year when a fixed constant value you might have hard-coded in an expression no longer reflects the observance of daylight savings time. See Also: What time is it? Well That Depends...
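The epoch arithmetic from this reply can be checked with a short Python sketch (illustrative only; GeoEvent's currentOffsetUTC() tracks daylight savings automatically, while the fixed -08:00 offset below is hard-coded for the winter-time example):

```python
from datetime import datetime, timezone, timedelta

# The epoch value from the post, in milliseconds (GeoEvent convention):
# "Thu Dec 30 16:00:00 PST 2021", i.e. Dec 31 00:00:00 UTC.
epoch_ms = 1640908800000

dt_utc = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
print(dt_utc.isoformat())  # 2021-12-31T00:00:00+00:00

# The expression myDate - (16 * 3600 * 1000): drop 16 hours of milliseconds.
adjusted = epoch_ms - 16 * 3600 * 1000

# What currentOffsetUTC() would return for a Pacific-time server in winter.
pacific = timezone(timedelta(hours=-8))
offset_ms = int(pacific.utcoffset(None).total_seconds() * 1000)
print(offset_ms)  # -28800000
```

This makes the post's warning concrete: the same epoch value renders as Dec 30 in Pacific time but Dec 31 in UTC, so shifting the epoch client-side can move the displayed date.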
02-26-2021 04:34 PM | POST
Hello Shital, There is no problem including multiple Field Calculator processors in a single GeoEvent Service. You sometimes have to "chain" a series of Field Calculators together to compute intermediate values and then perform calculations on those intermediate values. An example of this can be seen in the GeoNet thread How to switch positions on coordinates If you know that data you are receiving is in epoch seconds, you can scale the received Long integer value by multiplying by 1000 and write the computed result to a field whose type is Date.

For example, illustrated below is a GeoEvent Service whose input receives a single long integer value. The GeoEvent Definition used by the input has two additional fields, another Long and a Date, whose values are adapted as null when no values are provided in the received data structure. The first Field Calculator multiplies dt_seconds (a Long) by 1000 and writes the result into a field dt_long (also a Long). The second Field Calculator uses the exact same expression but writes the result into a field dt_date, which forces the Field Calculator to perform an implicit conversion from long integer to Date. I've chosen to show the input as JSON received over REST and the output as delimited text as that makes it clear what the data values are. Input: [{"dt_seconds": 1613600457}] Output: JsonReceiver,1613600457,1613600457000,2021-02-17T14:20:57.000-08:00 Note that the name of the GeoEvent Definition used by all nodes in the GeoEvent Service is JsonReceiver (the TEXT outbound adapter prepends that to the comma delimited values it produces). Also, the TEXT output can be configured to format Date values as ISO 8601 (as shown). You can use https://www.epochconverter.com to convert either the dt_seconds or the computed dt_long to show that either can be used by a system to represent the date/time shown formatted as an ISO 8601 string. I hope this helps -- RJ
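The seconds-to-milliseconds scaling and the implicit long-to-Date conversion can be verified with a few lines of Python (shown in UTC; the output string in the post is the same instant rendered in Pacific time):

```python
from datetime import datetime, timezone

dt_seconds = 1613600457    # value received by the input
dt_ms = dt_seconds * 1000  # first Field Calculator: multiply by 1000

# The implicit long -> Date conversion, rendered here in UTC. The post's
# ISO 8601 output 2021-02-17T14:20:57.000-08:00 is this same instant.
dt = datetime.fromtimestamp(dt_ms / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2021-02-17T22:20:57+00:00
```

As the post notes, sites like epochconverter.com will show the same round trip from either the 10-digit seconds value or the 13-digit milliseconds value.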
02-17-2021 03:04 PM | POST
The Expected Date Format property is only applicable to a single attribute field whose data type, as specified by the event record's GeoEvent Definition, is a Date. An inbound connector, like the one you are using (SystemFile/Text) would have to specify an existing GeoEvent Definition to use (vs. allowing the input to create one for you) in order to specify which attribute field(s) should be handled as Date values. However, you can only specify a single Expected Date Format mask, which means that both your "date" and your "time" would have to be part of a single attribute. An inbound adapter cannot handle the conversion of a "date" and a "time" when the values are in separate attribute fields. None of the configurable processors in the GeoEvent Server releases up through the current release (10.8.1) provide a way to "cast" data from a String to a Date. This means your only opportunity to adapt a string representation of a date/time and create an actual Date value is as part of an inbound connector's adapter. Once data has passed from an inbound connector into a GeoEvent Service you won't be able to "cast" a value to a Date unless the value happens to be an epoch value such as 784041330000. The good news, looking forward, is that the 10.9 release will have a toDate( ) function which both the Field Mapper and the Field Calculator will be able to use. The new function will allow two or more string values from separate attributes to be combined with a literal string to produce an ISO 8601 formatted date/time string. You will be able to send data to GeoEvent Server with a "time" and a "date" in separate fields ... for example: "dateString": "1994-11-05" and "timeString": "08:15:30-05:00" ... and write an expression such as toDate(dateString + 'T' + timeString) to create a single date/time string and write the result out to a field of type Date. - RJ reference: https://www.w3.org/TR/NOTE-datetime
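The 10.9 toDate( ) expression described above can be previewed with plain Python, using fromisoformat as a stand-in for the new function (the field names dateString and timeString come from the example in the post):

```python
from datetime import datetime

# Mirrors toDate(dateString + 'T' + timeString) from the post's example.
date_string = "1994-11-05"
time_string = "08:15:30-05:00"

combined = date_string + "T" + time_string
dt = datetime.fromisoformat(combined)  # ISO 8601 string -> actual date value
print(dt.year, dt.month, dt.day, dt.utcoffset())
```

The concatenated string is exactly the ISO 8601 form from the W3C note referenced above, which is what makes the final cast to a Date field succeed.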
01-20-2021 08:05 PM | POST
Hello Ctal - I would ask that you please submit an incident with Esri Technical support so that someone can be assigned specifically to follow up with you. If you would please include a sample of the tab-delimited data your File/Text inbound connector is receiving, that will help reproduce the issue. I'm going to take a guess and assume that you have configured a Watch a Folder for New CSV Files input using the input's Expected Date Format property to specify a Java SimpleDateFormat string. Data coming from your SQL database has only the "time" portion of a date/time string and thus is not one of the few well-known string formats GeoEvent Server recognizes for specifying a date/time value. GeoEvent Server uses epoch values (in milliseconds) to specify Date values. Both date and time are inherently part of every Date value since the values are not a time-of-day string but rather a number of milliseconds since the Unix Epoch (Midnight Jan 1 1970). GeoEvent Server's inbound adapter will not be able to cast a string value such as "23:45:15" to a Date value "(today) 11:45:15 PM (time zone) 2020" unless the string representation of the date/time incorporates a date as well as a time. Hope this information is helpful – RJ
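To illustrate why the time-only string fails, here is a small Python sketch; the date being prefixed is hypothetical and is exactly the piece of information the inbound adapter cannot invent for you:

```python
from datetime import datetime

# A time-only string like "23:45:15" cannot become a Date by itself --
# a Date is milliseconds since the Unix Epoch, so a date is required.
time_only = "23:45:15"
today = "2021-01-19"  # hypothetical; in practice this must come from the data

dt = datetime.strptime(today + " " + time_only, "%Y-%m-%d %H:%M:%S")
print(dt.isoformat())  # 2021-01-19T23:45:15
```

Combining the date into the string before (or while) the data reaches the inbound connector is what makes an Expected Date Format mask like yyyy-MM-dd HH:mm:ss workable.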
01-19-2021 12:05 PM | BLOG
In this blog we will take a deeper look at registering server connections with GeoEvent Server, something administrators commonly have to do when configuring a GeoEvent Server deployment.
Why do you have to register an ArcGIS Server connection with GeoEvent Server?
There are several different configurable components which require you to select a registered server and specify a services folder, a map/feature service, and often a specific feature layer exposed by that service in order to use the feature layer's schema or feature records. A few examples include:
The Poll an ArcGIS Server for Features input used to query a service for feature records.
When you want to import a GeoEvent Definition from a feature service's schema.
The Field Enricher (Feature Service) processor used to load a feature record set to use for enrichment.
The Add a Feature and Update a Feature outputs used to persist data from processed event records as feature records in a geodatabase.
All of the user-interfaces in the above examples populate their options from a cache of information GeoEvent Server collects when it queries a registered ArcGIS Server to discover published feature services. Service discovery is not performed the moment you click to open the panel. The cache is created and updated in the background because it can take several seconds – sometimes minutes or even tens of minutes – for GeoEvent Server to completely crawl the ArcGIS Server's REST Services Directory and query the information it needs from all of the available services.
Which leads into the topic I want to address in this blog: Is there a way to know when service discovery is being run and how long it is expected to take?
Using the ArcGIS Server Connection component to log messages
You can configure the following component logger to request DEBUG messages be logged:
com.esri.ges.datastore.agsconnection.DefaultArcGISServerConnection
I've included a sample of the logged messages you will be looking for at the end of this article. Requesting this component logger include DEBUG messages in its logging will allow you to see, in the system log file, when the service discovery kicks off and which map/feature services it interrogates to learn about their layers. A message will be logged for every feature service being interrogated as well as a success message when service discovery is complete. There is no indication in the GeoEvent Manager web application that service discovery is about to start or is currently running. The best way to tell that service discovery is running is to start the workflow to import a GeoEvent Definition. If the blue/white indicator displays, requesting you "please wait", you know that the GeoEvent Server is busy updating its ArcGIS Services cache. Otherwise the GeoEvent Definition user-interface will display immediately allowing you to choose a server connection, folder, feature service, and feature layer.
GeoEvent Manager does not allow you to configure, or schedule, when service discovery should take place. You are able to change the Discovery Rate for each server connection you register to specify how frequently a refresh should be performed. The default for recent software releases is every 60 minutes. I have personally found that when I use ArcGIS Pro, the Enterprise portal, or GeoEvent Manager to publish a new feature service, I want to use it now. So it is not uncommon for me to publish a feature service and immediately request GeoEvent Server run a service discovery to update its cache by clicking the refresh button on the server connection I have registered as a data store. This eliminates the need to have GeoEvent Server periodically refresh the cache for me, so I usually set the Discovery Rate for server connections I register to a fairly large value like 1440 minutes so that service discovery is run once per day (or when GeoEvent Server is stopped and restarted).
Service discovery takes too long and interferes with normal operations. Is there anything I can do?
This is something the product team is working on. Refactoring GeoEvent Server so that all of the operations which use feature services interface with ArcGIS Server some other way, however, is both high risk and high reward. Several different design options have been considered, but implementation has had to be deferred for each of the last several major releases. The best option for now, therefore, is to limit the number of feature services which are discoverable each time service discovery is run.
A recommended best practice is to configure your GeoEvent Server data store (that is, the server connection you register with GeoEvent Server) with named user credentials. The credentials do not have to belong to an administrative user, just a user who owns the published feature services. You can use the Enterprise portal content item manager to assign existing feature services a new owner and configure GeoEvent Server to authenticate as that user when crawling the ArcGIS Server's REST Services Directory, limiting the number of discoverable services. GeoEvent Server administrators often configure their registered server connections with the ArcGIS Server or Enterprise portal primary administrative account, which naturally sees all published services.
If you can identify the feature services to which you want GeoEvent Server to write data, or from which you want it to retrieve feature records or a feature layer's schema, and assign ownership of just those services to a user set aside specifically for your "real-time" data, you can speed up service discovery considerably by not crawling all of the feature services maintained by more traditional feature editing workflows. The trick is to identify the feature services your GeoEvent Server components actually care about and limit discovery to only those.
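If your services are federated with an Enterprise portal, one way to audit which feature services a dedicated "real-time" user owns is the portal's content search. The helper below just builds a query URL for the portal's `/sharing/rest/search` endpoint; the account name `realtime_owner` used in the test is a placeholder, and this is an illustration of the portal search syntax rather than anything GeoEvent Server does itself:

```python
from urllib.parse import urlencode

def build_owner_search_url(portal_url, owner):
    """Build a portal content search URL listing the Feature Services
    owned by a single named user (e.g. the account whose credentials are
    registered with the GeoEvent Server data store)."""
    params = {
        "q": f'owner:{owner} AND type:"Feature Service"',  # portal search syntax
        "f": "json",   # ask for a JSON response
        "num": 100,    # page size for the result set
    }
    return f"{portal_url}/sharing/rest/search?{urlencode(params)}"
```

Running the resulting query against your portal shows exactly the set of services GeoEvent Server would be able to discover when authenticating as that user.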
Example messages logged by the ArcGIS Server Connection component logger
2020-11-05T13:55:02,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | sleep interrupted
java.lang.InterruptedException: sleep interrupted
at java.lang.Thread.sleep(Native Method) ~[?:?]
at com.esri.ges.datastore.agsconnection.DefaultArcGISServerConnection$CacheUpdater.run
(DefaultArcGISServerConnection.java:235) [56:com.esri.ges.framework.datastore.agsconnection-datastore:10.8.1]
2020-11-05T13:55:02,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Exiting Cache Updater run method....
2020-11-05T13:55:03,274 | DEBUG | qtp808353329-598 | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Create an instance of CacheUpdater....
2020-11-05T13:55:03,274 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating cache for DataStore Public_Esri_Hosted_Server...
2020-11-05T13:55:08,944 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Active_Hurricanes_Sampler", Type: "FeatureServer").
2020-11-05T13:55:15,296 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Active_Hurricanes_v1", Type: "FeatureServer").
2020-11-05T13:55:21,613 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Air_Quality_PM25_Latest_Results", Type: "FeatureServer").
2020-11-05T13:55:23,311 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "ASAM_events_V1", Type: "FeatureServer").
2020-11-05T13:55:25,033 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Coral_Reef_Stations", Type: "FeatureServer").
2020-11-05T13:55:27,876 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "GDELT_Health_Pandemic", Type: "FeatureServer").
2020-11-05T13:55:29,170 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "GDELT_v1_Social_Tones", Type: "FeatureServer").
2020-11-05T13:55:30,368 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "IHME_Projected_Peaks_QA", Type: "FeatureServer").
2020-11-05T13:55:31,995 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Median_Sea_Ice_Extent_for_the_Antarctic", Type: "FeatureServer").
2020-11-05T13:55:32,995 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Median_Sea_Ice_Extent_for_the_Arctic", Type: "FeatureServer").
2020-11-05T13:55:34,063 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "MODIS_Thermal_v1", Type: "FeatureServer").
2020-11-05T13:55:35,225 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "National_Farmers_Market_Directory", Type: "FeatureServer").
2020-11-05T13:55:36,445 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "ncov_cases", Type: "FeatureServer").
2020-11-05T13:55:38,421 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_Ice_v1", Type: "FeatureServer").
2020-11-05T13:55:40,579 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_Precipitation_v1", Type: "FeatureServer").
2020-11-05T13:55:43,479 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_SnowFall_v1", Type: "FeatureServer").
2020-11-05T13:55:45,920 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindForecast_v1", Type: "FeatureServer").
2020-11-05T13:55:49,935 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindGust_v1", Type: "FeatureServer").
2020-11-05T13:55:51,519 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindSpeed_v1", Type: "FeatureServer").
2020-11-05T13:55:53,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NDGD_SmokeForecast_v1", Type: "FeatureServer").
2020-11-05T13:55:54,193 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_METAR_current_wind_speed_direction_v1", Type: "FeatureServer").
2020-11-05T13:55:55,683 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_short_term_warnings_v1", Type: "FeatureServer").
2020-11-05T13:55:58,401 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_storm_reports_Sampler", Type: "FeatureServer").
2020-11-05T13:56:00,844 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_storm_reports_v1", Type: "FeatureServer").
2020-11-05T13:56:06,494 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NWS_Watches_Warnings_Sampler", Type: "FeatureServer").
2020-11-05T13:56:12,942 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "NWS_Watches_Warnings_v1", Type: "FeatureServer").
2020-11-05T13:56:19,622 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Recent_Hurricanes_v1", Type: "FeatureServer").
2020-11-05T13:56:21,726 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Satellite_VIIRS_Thermal_Hotspots_and_Fire_Activity", Type: "FeatureServer").
2020-11-05T13:56:22,871 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "seaice_extent_N_v1", Type: "FeatureServer").
2020-11-05T13:56:23,976 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "seaice_extent_S_v1", Type: "FeatureServer").
2020-11-05T13:56:25,081 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "SPI_recent", Type: "FeatureServer").
2020-11-05T13:56:28,250 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "Standardized_Precipitation_Index_(SPI)", Type: "FeatureServer").
2020-11-05T13:56:31,492 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "US_Cases_per_county_(time)", Type: "FeatureServer").
2020-11-05T13:56:33,073 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "USA_Wildfires_v1", Type: "FeatureServer").
2020-11-05T13:56:35,074 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating layers for Service (Folder: "/", Name: "USGS_Seismic_Data_v1", Type: "FeatureServer").
2020-11-05T13:56:36,595 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore - 10.8.1 | Updating cache for DataStore Public_Esri_Hosted_Server done. Success: true.
You can see from the above that discovery of 35 feature services hosted by an external, public server took just over 90 seconds. The more services there are to discover, the longer service discovery will take.
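The 90-second figure can be checked directly from the log excerpt. The snippet below parses the first cache-update timestamp and the final success timestamp (note the comma before the milliseconds in the log's timestamp format) and computes the elapsed time:

```python
from datetime import datetime

# Timestamp format used in the GeoEvent Server log excerpt above,
# e.g. 2020-11-05T13:55:03,274 (comma separates the milliseconds).
LOG_TS_FORMAT = "%Y-%m-%dT%H:%M:%S,%f"

def discovery_duration(start_ts, end_ts):
    """Elapsed seconds between two log timestamps."""
    start = datetime.strptime(start_ts, LOG_TS_FORMAT)
    end = datetime.strptime(end_ts, LOG_TS_FORMAT)
    return (end - start).total_seconds()

# "Updating cache ..." start vs. "... done. Success: true." from the log:
elapsed = discovery_duration("2020-11-05T13:55:03,274", "2020-11-05T13:56:36,595")
# elapsed is 93.321 seconds, i.e. just over a minute and a half
```

The same approach works for any pair of messages in the log, for example timing how long a single feature service's "Updating layers" step took relative to the next one.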
See Also:
GeoEvent Server > Administer > Data stores in the GeoEvent Server on-line help
Debug Techniques - Configuring the application logger GeoEvent Server blog series
GeoEvent Configuration: Data Store Connections w/Tokens blog by Eric Ironside
Posted 11-05-2020, 05:40 PM