POST
@Ctal_GISsquad - Please see my reply to your question in the thread Converting between Date Formats. Clicking here should take you directly to my reply in the thread.
POST
If the data you are receiving contains only a date value (e.g. 12/31/2021) without a time, this is not a pattern GeoEvent Server recognizes unless you specify an Expected Date Format the inbound adapter can use to figure out how to parse a String as a Date. You would have to specify a value like MM/dd/yyyy when configuring your inbound connector. The connector will apply this pattern to all event record attributes whose data type is Date in the GeoEvent Definition used by the inbound connector.

When I send the String value "12/31/2021" to my GeoEvent Server with the Expected Date Format configuration described above, the Date value my inbound adapter constructs from the received string is 1640937600000. This is an epoch value used by Java to represent date/time values. GeoEvent Server uses millisecond epoch values, which is why the value has 13 digits rather than only 10. If I ask GeoEvent Server to cast its Date to a String, I get a representation of the date which looks like "Fri Dec 31 00:00:00 PST 2021". Notice that the string has both a "date" and a "time" and includes the time zone for the expressed date/time value.

In this case, the Date is expressed in the Pacific time zone. This is because an Expected Date Format pattern was specified, which is required to handle an inbound string that does not match one of the few built-in expected patterns for a date/time value. The time zone handling is important to note because, in this case, the date/time is not in UTC units. GeoEvent Server assumes that the non-standard date/time must be a date/time local to my solution, so it uses the locale of my server (whose clock is configured to use the Pacific Time Zone).
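The arithmetic above can be sketched in plain Python (this is only an illustration of what the adapter does with the MM/dd/yyyy mask, not GeoEvent Server code; the fixed -08:00 offset is my assumption for a server in Pacific Standard Time):

```python
from datetime import datetime, timezone, timedelta

# Illustration only: parse "12/31/2021" the way an Expected Date Format mask of
# MM/dd/yyyy would, assuming the server's locale is Pacific Standard Time (UTC-08:00).
pst = timezone(timedelta(hours=-8))
parsed = datetime.strptime("12/31/2021", "%m/%d/%Y").replace(tzinfo=pst)

# GeoEvent Server stores Dates as millisecond epoch values, hence 13 digits.
epoch_ms = int(parsed.timestamp() * 1000)
print(epoch_ms)  # 1640937600000 -> "Fri Dec 31 00:00:00 PST 2021"
```

Note the value is 8 hours' worth of milliseconds larger than the UTC-midnight epoch 1640908800000, because the parsed date was assumed to be local to the server.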
Focusing on your question: if you are receiving a string value which is somehow being adapted to produce the epoch date value 1640908800000 (which could also be represented as "Thursday, December 30, 2021 4:00:00 PM GMT-08:00" or "Thu Dec 30 16:00:00 PST 2021") and you need to truncate the value to be simply "Thursday, December 30" ... you have a couple of options.

I strongly recommend you make sure you understand how the received data is actually being adapted, and verify how client applications are representing the value in web map pop-ups or web forms. A client application will likely render a Java epoch date/time value it receives, when querying a feature service for feature records for example, in whatever time zone the client web application is running. The value 1640908800000 already represents the date/time "Friday, December 31, 2021 12:00:00 AM" when a UTC value is assumed, and web clients are likely going to represent an assumed UTC date/time in whatever time zone the web application is running. If you were to add or subtract some number of milliseconds from the epoch to drop the "time" portion and keep only the whole "date" value, your effort is likely going to have unintended consequences client-side.

You could use a RegEx pattern match on a value toString(myDate) to isolate the whole hours portion of the "time", multiply this by 3,600,000 (which is 60 min x 60 sec x 1000 ms), and then subtract that from your Date using a Field Calculator. The eventual expression would be something like: myDate - (16 * 3600 * 1000). This assumes you are able to extract the value "16" from a string "Thu Dec 30 16:00:00 PST 2021" to know that you want to subtract 16 hours' worth of milliseconds from the myDate attribute value. You also might want to look at some of the supported expressions for the Field Calculator processor.
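The extract-and-subtract idea can be sketched as follows (hypothetical illustration, not a GeoEvent Server expression; the regex shown is one possible way to isolate the hours from the Date's string representation):

```python
import re

# Sketch of the approach described above: isolate the whole hours from a Date's
# string representation, then subtract that many hours' worth of milliseconds
# to truncate the Date back to local midnight.
my_date = 1640908800000  # epoch ms for "Thu Dec 30 16:00:00 PST 2021"
as_string = "Thu Dec 30 16:00:00 PST 2021"  # i.e. toString(myDate)

hours = int(re.search(r"\b(\d{2}):\d{2}:\d{2}\b", as_string).group(1))
truncated = my_date - hours * 3600 * 1000  # myDate - (16 * 3600 * 1000)
print(truncated)  # 1640851200000 -> "Thu Dec 30 00:00:00 PST 2021"
```

As cautioned above, a web client assuming the truncated epoch is UTC may still display a different calendar date, so verify client-side rendering before adopting this.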
The function currentOffsetUTC() specifically computes the millisecond difference between your GeoEvent Server's locale and UTC. Since my server is configured to use the Pacific Time Zone, which is currently -08:00 hours behind UTC, the currentOffsetUTC() function returns the value -28800000, which is -(8 hours x 60 minutes x 60 seconds x 1000 milliseconds). You might scale the computed value by some constant when performing date/time adjustment arithmetic, or more likely, shift an epoch Date from an assumed local time zone so that the value represents a UTC value. The advantage of using currentOffsetUTC() is that the function automatically recognizes changes in daylight saving time, so you don't have to rely on memory to update GeoEvent Services twice a year when a fixed constant you might have hard-coded in an expression no longer reflects the observance of daylight saving time. See Also: What time is it? Well That Depends...
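For intuition, here is a rough Python equivalent of what currentOffsetUTC() computes (an illustration of the concept, not the GeoEvent Server implementation): the millisecond offset between the host's local time zone and UTC, which tracks daylight saving time automatically.

```python
from datetime import datetime

def current_offset_utc_ms() -> int:
    """Millisecond offset of the local time zone from UTC (DST-aware)."""
    # astimezone() with no argument attaches the system's current local zone,
    # so the offset changes automatically when daylight saving time begins/ends.
    offset = datetime.now().astimezone().utcoffset()
    return int(offset.total_seconds() * 1000)

# On a server in the Pacific Time Zone this returns -28800000 during PST
# and -25200000 during PDT.
```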
POST
Hello Shital, There is no problem including multiple Field Calculator processors in a single GeoEvent Service. You sometimes have to "chain" a series of Field Calculators together to compute intermediate values and then perform calculations on those intermediate values. An example of this can be seen in the GeoNet thread How to switch positions on coordinates.

If you know that the data you are receiving is in epoch seconds, you can scale the received Long integer value by multiplying by 1000 and write the computed result to a field whose type is Date. For example, illustrated below is a GeoEvent Service whose input receives a single long integer value. The GeoEvent Definition used by the input has two additional fields, another Long and a Date, whose values are adapted as null when no values are provided in the received data structure. The first Field Calculator multiplies dt_seconds (a Long) by 1000 and writes the result into a field dt_long (also a Long). The second Field Calculator uses the exact same expression but writes the result into a field dt_date, which forces the Field Calculator to perform an implicit conversion from long integer to Date. I've chosen to show the input as JSON received over REST and the output as delimited text as that makes it clear what the data values are.

Input: [{"dt_seconds": 1613600457}]
Output: JsonReceiver,1613600457,1613600457000,2021-02-17T14:20:57.000-08:00

Note that the name of the GeoEvent Definition used by all nodes in the GeoEvent Service is JsonReceiver (the TEXT outbound adapter prepends that to the comma-delimited values it produces). Also, the TEXT output can be configured to format Date values as ISO 8601 (as shown). You can use https://www.epochconverter.com to convert either the dt_seconds or the computed dt_long to show that either can be used by a system to represent the date/time shown formatted as an ISO 8601 string. I hope this helps -- RJ
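The two chained calculations can be mirrored in plain Python to show what each step produces (illustration only; the actual work is done by GeoEvent Server's Field Calculator processors):

```python
from datetime import datetime, timezone

# Step 1 (first Field Calculator): scale epoch seconds to milliseconds.
dt_seconds = 1613600457
dt_long = dt_seconds * 1000

# Step 2 (second Field Calculator): the same arithmetic, but written to a Date
# field, which forces an implicit Long -> Date conversion.
dt_date = datetime.fromtimestamp(dt_long / 1000, tz=timezone.utc)

print(dt_long)              # 1613600457000
print(dt_date.isoformat())  # 2021-02-17T22:20:57+00:00 (the post's output shows
                            # the same instant in -08:00 local time)
```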
2 weeks ago
POST
The Expected Date Format property is only applicable to a single attribute field whose data type, as specified by the event record's GeoEvent Definition, is Date. An inbound connector like the one you are using (System File/Text) would have to specify an existing GeoEvent Definition to use (vs. allowing the input to create one for you) in order to specify which attribute field(s) should be handled as Date values. However, you can only specify a single Expected Date Format mask, which means that both your "date" and your "time" would have to be part of a single attribute. An inbound adapter cannot handle the conversion of a "date" and a "time" when the values are in separate attribute fields.

None of the configurable processors in the GeoEvent Server releases up through the current release (10.8.1) provide a way to "cast" data from a String to a Date. This means your only opportunity to adapt a string representation of a date/time and create an actual Date value is as part of an inbound connector's adapter. Once data has passed from an inbound connector into a GeoEvent Service you won't be able to "cast" a value to a Date unless the value happens to be an epoch value such as 784041330000.

The good news, looking forward, is that the 10.9 release will have a toDate() function which both the Field Mapper and the Field Calculator will be able to use. The new function will allow two or more string values from separate attributes to be combined with a literal string to produce an ISO 8601 formatted date/time string. You will be able to send data to GeoEvent Server with a "time" and a "date" in separate fields ... for example: "dateString": "1994-11-05" and "timeString": "08:15:30-05:00" ... and write an expression such as toDate(dateString + 'T' + timeString) to create a single date/time string and write the result out to a field of type Date. - RJ

reference: https://www.w3.org/TR/NOTE-datetime
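The string concatenation that toDate() will perform can be demonstrated in Python (illustration only; toDate() itself is the forthcoming GeoEvent Server 10.9 function, this just shows why joining the two fields with a literal 'T' yields a parseable ISO 8601 value):

```python
from datetime import datetime

# The two separate attribute values from the example above.
date_string = "1994-11-05"
time_string = "08:15:30-05:00"

# Equivalent of the expression toDate(dateString + 'T' + timeString):
combined = date_string + "T" + time_string
as_date = datetime.fromisoformat(combined)  # a single ISO 8601 date/time

print(combined)
print(int(as_date.timestamp() * 1000))  # the millisecond epoch a Date field would hold
```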
01-20-2021 08:05 PM
POST
Hello Ctal - I would ask that you please submit an incident with Esri Technical Support so that someone can be assigned specifically to follow up with you. If you would please include a sample of the tab-delimited data your File/Text inbound connector is receiving, that will help reproduce the issue.

I'm going to take a guess and assume that you have configured a Watch a Folder for New CSV Files input, using the input's Expected Date Format property to specify a Java SimpleDateFormat string. The data coming from your SQL database has only the "time" portion of a date/time string and thus is not one of the few well-known string formats GeoEvent Server recognizes for specifying a date/time value.

GeoEvent Server uses epoch values (in milliseconds) to specify Date values. Both date and time are inherently part of every Date value, since the values are not a time-of-day string but rather a number of milliseconds since the Unix Epoch (midnight, January 1, 1970). GeoEvent Server's inbound adapter will not be able to cast a string value such as "23:45:15" to a Date value "(today) 11:45:15 PM (time zone) 2020" unless the string representation of the date/time incorporates a date as well as a time. Hope this information is helpful – RJ
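A quick illustration of why a time-only string cannot become a meaningful Date (Python rather than Java here, but the problem is the same): when the date portion is absent, the parser has nothing sensible to fill it with.

```python
from datetime import datetime

# Parsing a time-only string: Python's strptime silently defaults the missing
# date to 1900-01-01, which shows why an adapter cannot recover "today" from a
# bare time-of-day value.
t = datetime.strptime("23:45:15", "%H:%M:%S")
print(t.isoformat())  # 1900-01-01T23:45:15
```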
01-19-2021 12:05 PM
BLOG
In this blog we will take a deeper look at registering server connections with GeoEvent Server, something administrators commonly have to do when configuring a GeoEvent Server deployment.

Why do you have to register an ArcGIS Server connection with GeoEvent Server?

There are several different configurable components which require you to select a registered server and specify which services folder, map/feature service, and often a specific feature layer exposed by that service in order to use the feature layer's schema or feature records. A few examples include:

- The Poll an ArcGIS Server for Features input used to query a service for feature records.
- Importing a GeoEvent Definition from a feature service's schema.
- The Field Enricher (Feature Service) processor used to load a feature record set to use for enrichment.
- The Add a Feature and Update a Feature outputs used to persist data from processed event records as feature records in a geodatabase.

All of the user interfaces in the above examples populate their options from a cache of information GeoEvent Server collects when it queries a registered ArcGIS Server to discover published feature services. Service discovery is not performed the moment you click to open the panel. The cache is created and updated in the background because it can take several seconds, sometimes minutes or even tens of minutes, for GeoEvent Server to completely crawl the ArcGIS Server's REST Services Directory and query the information it needs from all of the available services. Which leads into the topic I want to address in this blog: is there a way to know when service discovery is being run and how long it is expected to take?
Using the ArcGIS Server Connection component to log messages

You can configure the following component logger to request DEBUG messages be logged: com.esri.ges.datastore.agsconnection.DefaultArcGISServerConnection

I've included a sample of the logged messages you will be looking for at the end of this article. Requesting this component logger include DEBUG messages in its logging will allow you to see, in the system log file, when service discovery kicks off and which map/feature services it interrogates to learn about their layers. A message will be logged for every feature service being interrogated, as well as a success message when service discovery is complete.

There is no indication in the GeoEvent Manager web application that service discovery is about to start or is currently running. The best way to tell that service discovery is running is to start the workflow to import a GeoEvent Definition. If the blue/white indicator displays, requesting you "please wait", you know that GeoEvent Server is busy updating its ArcGIS Services cache. Otherwise the GeoEvent Definition user interface will display immediately, allowing you to choose a server connection, folder, feature service, and feature layer.

GeoEvent Manager does not allow you to configure, or schedule, when service discovery should take place. You are able to change the Discovery Rate for each server connection you register to specify how frequently a refresh should be performed. The default for recent software releases is every 60 minutes. I have personally found that when I use ArcGIS Pro, the Enterprise portal, or GeoEvent Manager to publish a new feature service, I want to use it now – so it is not uncommon for me to publish a feature service and immediately request GeoEvent Server run a service discovery to update its cache by clicking the refresh button on the server connection I have registered as a Data Store.
This eliminates the need to have GeoEvent Server periodically refresh the cache for me, so I usually set the Discovery Rate for server connections I register to a fairly large value like 1440 minutes, so that service discovery is run once per day (or when GeoEvent Server is stopped and restarted).

Service discovery takes too long and interferes with normal operations. Is there anything I can do?

This is something the product team is working on. Refactoring GeoEvent Server so that all of the operations which use feature services interface with ArcGIS Server some other way, however, is both high risk and high reward. Several different design options have been considered, but implementation has had to be deferred for each of the last several major releases. The best option for now, therefore, is to limit the number of feature services which are discoverable each time service discovery is run.

A recommended best practice is to configure your GeoEvent Server Data Store (e.g. the server connection you are registering with GeoEvent Server) with credentials. The user credentials do not have to be those of an administrative user, just a user who owns and published the feature services. You can use the Enterprise portal content item manager to assign existing feature services a new owner, and configure GeoEvent Server to authenticate as that user when crawling the ArcGIS Server's REST Services Directory, to limit the number of discoverable services. GeoEvent Server administrators often configure their registered server connections with the ArcGIS Server or Enterprise portal primary administrative account – which naturally sees all published services.
If you can identify the feature services to which you want GeoEvent Server to write data, or from which you want GeoEvent Server to retrieve feature records or a feature layer's schema, and assign ownership of just those services to a user set aside specifically for your "real-time" data, you can improve service discovery considerably by not crawling all of the feature services being maintained by more traditional feature editing workflows. The trick is to identify the feature services your GeoEvent Server components actually care about and limit discovery to only those feature services.

Example messages logged by the ArcGIS Server Connection component logger

2020-11-05T13:55:02,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | sleep interrupted
java.lang.InterruptedException: sleep interrupted
	at java.lang.Thread.sleep(Native Method) ~[?:?]
	at com.esri.ges.datastore.agsconnection.DefaultArcGISServerConnection$CacheUpdater.run(DefaultArcGISServerConnection.java:235) [56:com.esri.ges.framework.datastore.agsconnection-datastore:10.8.1]
2020-11-05T13:55:02,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Exiting Cache Updater run method....
2020-11-05T13:55:03,274 | DEBUG | qtp808353329-598 | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Create an instance of CacheUpdater....
2020-11-05T13:55:03,274 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating cache for DataStore Public_Esri_Hosted_Server...
2020-11-05T13:55:08,944 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Active_Hurricanes_Sampler", Type: "FeatureServer").
2020-11-05T13:55:15,296 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Active_Hurricanes_v1", Type: "FeatureServer").
2020-11-05T13:55:21,613 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Air_Quality_PM25_Latest_Results", Type: "FeatureServer").
2020-11-05T13:55:23,311 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "ASAM_events_V1", Type: "FeatureServer").
2020-11-05T13:55:25,033 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Coral_Reef_Stations", Type: "FeatureServer").
2020-11-05T13:55:27,876 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "GDELT_Health_Pandemic", Type: "FeatureServer").
2020-11-05T13:55:29,170 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "GDELT_v1_Social_Tones", Type: "FeatureServer").
2020-11-05T13:55:30,368 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "IHME_Projected_Peaks_QA", Type: "FeatureServer").
2020-11-05T13:55:31,995 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Median_Sea_Ice_Extent_for_the_Antarctic", Type: "FeatureServer").
2020-11-05T13:55:32,995 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Median_Sea_Ice_Extent_for_the_Arctic", Type: "FeatureServer").
2020-11-05T13:55:34,063 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "MODIS_Thermal_v1", Type: "FeatureServer").
2020-11-05T13:55:35,225 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "National_Farmers_Market_Directory", Type: "FeatureServer").
2020-11-05T13:55:36,445 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "ncov_cases", Type: "FeatureServer").
2020-11-05T13:55:38,421 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_Ice_v1", Type: "FeatureServer").
2020-11-05T13:55:40,579 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_Precipitation_v1", Type: "FeatureServer").
2020-11-05T13:55:43,479 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_SnowFall_v1", Type: "FeatureServer").
2020-11-05T13:55:45,920 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindForecast_v1", Type: "FeatureServer").
2020-11-05T13:55:49,935 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindGust_v1", Type: "FeatureServer").
2020-11-05T13:55:51,519 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDFD_WindSpeed_v1", Type: "FeatureServer").
2020-11-05T13:55:53,124 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NDGD_SmokeForecast_v1", Type: "FeatureServer").
2020-11-05T13:55:54,193 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_METAR_current_wind_speed_direction_v1", Type: "FeatureServer").
2020-11-05T13:55:55,683 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_short_term_warnings_v1", Type: "FeatureServer").
2020-11-05T13:55:58,401 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_storm_reports_Sampler", Type: "FeatureServer").
2020-11-05T13:56:00,844 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NOAA_storm_reports_v1", Type: "FeatureServer").
2020-11-05T13:56:06,494 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NWS_Watches_Warnings_Sampler", Type: "FeatureServer").
2020-11-05T13:56:12,942 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "NWS_Watches_Warnings_v1", Type: "FeatureServer").
2020-11-05T13:56:19,622 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Recent_Hurricanes_v1", Type: "FeatureServer").
2020-11-05T13:56:21,726 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Satellite_VIIRS_Thermal_Hotspots_and_Fire_Activity", Type: "FeatureServer").
2020-11-05T13:56:22,871 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "seaice_extent_N_v1", Type: "FeatureServer").
2020-11-05T13:56:23,976 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "seaice_extent_S_v1", Type: "FeatureServer").
2020-11-05T13:56:25,081 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "SPI_recent", Type: "FeatureServer").
2020-11-05T13:56:28,250 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "Standardized_Precipitation_Index_(SPI)", Type: "FeatureServer").
2020-11-05T13:56:31,492 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "US_Cases_per_county_(time)", Type: "FeatureServer").
2020-11-05T13:56:33,073 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "USA_Wildfires_v1", Type: "FeatureServer").
2020-11-05T13:56:35,074 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating layers for Service (Folder: "/", Name: "USGS_Seismic_Data_v1", Type: "FeatureServer").
2020-11-05T13:56:36,595 | DEBUG | Public_Esri_Hosted_Server-Updater | DefaultArcGISServerConnection | 56 - com.esri.ges.framework.datastore.agsconnection-datastore-10.8.1 | Updating cache for DataStore Public_Esri_Hosted_Server done. Success: true.

You can see from the above that discovery of 36 feature services hosted by an external, public server took just over 90 seconds. The more services there are to discover, the longer service discovery will take.

See Also:
GeoEvent Server > Administer > Data stores in the GeoEvent Server on-line help
Debug Techniques - Configuring the application logger GeoEvent Server blog series
GeoEvent Configuration: Data Store Connections w/Tokens blog by Eric Ironside
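As a side note, the discovery duration can be computed directly from the first "Updating cache" message and the final "done" message (a quick sketch using the timestamps from the log above):

```python
from datetime import datetime

# Parse the two bracketing log timestamps (format: ISO date, comma-separated ms)
# and compute the elapsed service discovery time.
fmt = "%Y-%m-%dT%H:%M:%S,%f"
start = datetime.strptime("2020-11-05T13:55:03,274", fmt)  # "Updating cache for DataStore ..."
done = datetime.strptime("2020-11-05T13:56:36,595", fmt)   # "... done. Success: true."

print((done - start).total_seconds())  # 93.321 -> "just over 90 seconds"
```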
11-05-2020 05:40 PM
POST
Hello jess neuner – You're correct that clicking Publish Stream Service a "second" time is not going to show you the existing service's configuration. All that does is open the dialog to allow you to begin selecting (again) the properties you want to pass to the service publication mechanism in order to publish a service. I'm assuming you toggled the Store Latest capability when originally publishing the stream service, and want to know which feature service the stream service is using to store latest observations?

Your best bet, if I'm following your question, is to browse to the stream service in the ArcGIS REST Services Directory. At the top of the service's web page, in the upper-left corner, you should see a JSON link; clicking this will open a browser tab with the service's raw JSON specification. In there you will find things like which field is being used as a track identifier (derived from the field tagged TRACK_ID in the selected GeoEvent Definition) and the URL of any supporting feature services for capabilities like "Store Latest" and "Related Features".

If you open an incident with Esri Technical Support, we can probably publish a stream service with store latest similar to the one you have in your environment, then deliberately use the stream service to broadcast some data which the feature service would not be able to accept or persist as a feature record in the geodatabase, and see how that condition gets logged in the GeoEvent Server's karaf.log ... Hope this information is helpful – RJ
10-12-2020 01:10 PM
POST
Matej – Using my test data above, I can configure three MCFS (multicardinal field splitter) processors to split the data first on the group element metric_collection, then on the group element Level1_Group3, and finally on the array Group3_Items. Following this pattern I think you'll be able to accomplish what you need to do. Above, I've illustrated the three event records routed from the final field splitter to the output. Please make sure you have downloaded the latest release of the field splitter processor bundle (Release 8 - June 24, 2020) for release 10.8.1 of ArcGIS. This version includes changes which allow multiple MCFS processors to be arrayed in series. If you are using an earlier release of ArcGIS, please use Release 7 (February 27, 2020). Hope this information is helpful – RJ
08-27-2020 02:02 PM
POST
Hello Matej – Unlike a Field Mapper or Field Calculator, which needs you to specify the full "path" into a data structure, the Field Splitter and in particular the Multicardinal Field Splitter processors expect only the name of the element at which you want the split to be applied. So the Field to Split parameter's value should specify only the element's base name, linearWithinLinearGNElement, not the full path to the element (e.g. groupOfLocations.globalNetworkLinear.linearWithinLinearGNElement). That said, you may have found a limit of what the processor is able to handle in terms of a deeply nested, hierarchical, multicardinal data structure.

To test what you are trying to do, I created some sample JSON data which adapts using the GeoEvent Definition illustrated below. Note that it does not matter whether the data received is formatted as XML or JSON – the inbound connector has to adapt whatever data is received to create an event record which can be routed to a GeoEvent Service for processing. After receiving the JSON above and allowing the inbound connector to adapt it using the illustrated GeoEvent Definition, I can use a Field Mapper to extract, for example, the data value "Delta" (line 21 in the sample JSON) by specifying the full path down through the data structure to the "E301" element in the array: metric_collection.Level1_Group3.Group3_Items[1].E301

If I specify the same full path when configuring the Multicardinal Field Splitter, I see the error message you were able to capture from the system log: Field name <name> is invalid. However, if I specify only Group3_Items (the name of the array) as the field on which to conduct the split, I see a different error message in the log indicating a null pointer exception was encountered. I am in contact with one of our developers to review the source code of this processor.
We will see what we can do to better handle the case where Group3_Items (the name of the array) is specified, and why a null pointer exception is not being caught. The workaround, for now, is to use multiple Multicardinal Field Splitter processors in series and "walk down" the hierarchy from the top to the multicardinal element on which you want the final split applied. I'll reply to this post with what I tested so you can see what I had to do to implement the workaround.

Please make sure you have downloaded the latest release of the field splitter processor bundle (Release 8 - June 24, 2020) for release 10.8.1 of ArcGIS. This version includes changes which allow multiple MCFS processors to be arrayed in series. If you are using an earlier release of ArcGIS, please use Release 7 (February 27, 2020). Hope this information is helpful – RJ
08-26-2020 03:41 PM

POST
Hello Jitendrudu – When replies were written for the original thread Re: GeoEvent process Oracle/SQL connector I don't think GeoEvent Manager supported publishing a feature service. The functional server role for licensing hadn't been invented yet, and GeoEvent Server was still being referred to as the GeoEvent Processor Extension for ArcGIS Server. In recent releases, however, you can use GeoEvent Manager to publish Hosted Feature Layers to your Enterprise portal or map/feature services to a stand-alone ArcGIS Server.

So on the one hand, nothing has changed with regard to the interface between GeoEvent Server and an RDBMS table. The interface is still a feature service, since GeoEvent Server can only make REST requests against an Esri ArcGIS Server feature service to add or update feature records. But on the other hand, you can now publish feature services which do not necessarily have geometry.

A feature service is typically published using a client, like ArcMap, which will not allow you to publish a feature service without a geometry. ArcMap defines a feature class as a collection of feature records, and a feature record is a data structure with both geometry and attributes describing some object in the real world. GeoEvent Server doesn't make the same assumptions regarding geometry that ArcMap does, though. Real-time data streaming from a sensor whose position does not change wouldn't necessarily include a geometry or coordinate values with its data. GeoEvent Server doesn't assume that all data records will have geometry; to GeoEvent Server, "geometry" is just another type of attribute, like a String or a Date.

Say, for example, you were to create a GeoEvent Definition with two attribute fields – one a Date, the other a String – and didn't include a Geometry field type. You can use any release of GeoEvent after 10.6.1 to publish a feature service to the ArcGIS Server beneath which GeoEvent Server is running.
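For illustration, here is a sketch of the kind of attributes-only request body that ends up being posted to a feature service's addFeatures endpoint in this scenario. The field names and URL below are made up, not from an actual service:

```python
# Hypothetical sketch of an attributes-only addFeatures request body.
# Field names and the URL are made up; the point is that the feature
# carries no "geometry" key, only "attributes".
import json
import time

feature = {
    "attributes": {
        "reading_time": int(time.time() * 1000),  # Date as epoch milliseconds
        "status_text": "OK",                      # plain String attribute
    }
}

payload = {
    "f": "json",
    "features": json.dumps([feature]),
}
# POST the payload to a URL shaped like:
#   https://<server>/arcgis/rest/services/<name>/FeatureServer/0/addFeatures
```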
Without a geometry, to any other client the feature class's table looks like a non-spatial table, but ArcGIS Server provides a feature service interface to the table, allowing you to write the Date and String data you want to it without a geometry. You would still configure an Update a Feature output, and make sure to use a Field Mapper to map whatever schema your inbound connector uses to adapt received data to the feature service's schema (PostgreSQL, for example, insists on using only lower-case field names; Oracle only upper-case field names). You shouldn't have any problem ingesting some sample data that has no geometry and using GeoEvent Server to add/update "feature records" via the feature service.

The data records in the RDBMS simply have no geometry, so you won't be able to add them to a web map as a feature layer, for example, but you can still query the data at REST. If you had some other database client able to query the data table, I assume you'd be able to retrieve the data that way rather than going through the feature service's REST interface. This would be a lot easier than developing a custom outbound transport which understood how to connect to the RDBMS using ODBC. That is possible, but I don't know how feasible it is, really, given the inter-dependency of bundles in the Java system framework that underlies GeoEvent Server. GeoEvent Server's whole design, in this case, assumes that you'll be able to work through a feature service to access feature records. Hope this information is helpful – RJ
08-18-2020 03:24 PM