BLOG
ArcGIS 11.1 updates to the Field Enricher (Feature Service) processor include the ability to select units for the cache refresh time and to set the refresh rate as frequently as every 10 seconds. The above cache settings should be used with caution: configuring the processor to frequently query the feature service containing the feature records used for enrichment can cripple the number of event records you can process through a GeoEvent Service every second. The processor cannot be configured to refresh its cache any faster than once every 10 seconds; validation is conducted when the GeoEvent Service is published.

As @EricIronside mentions above, the cache refresh can be disabled by setting the Cache Refresh Time Interval parameter to 0. This effectively means that when the processor observes an event record with a given attribute join identifier (usually the TRACK_ID), it makes a single query to retrieve the matching feature record from the feature service and loads the enrichment data into its cache for that identifier. The cached value will not be updated, as it is set to never expire.

Publishing changes to the GeoEvent Service will create a new instance of the processor, whose cache is empty. Stopping and restarting the GeoEvent Server service(s) will also obliterate a Field Enricher (Feature Service) processor instance's cache.
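The cache lifecycle described above can be sketched as a small TTL cache in Python. This is a conceptual illustration only, not the processor's actual implementation; the query_feature_service stub stands in for the real HTTP query against the enrichment feature service.

```python
import time

def query_feature_service(track_id):
    # Stand-in for the query the processor sends to the feature service;
    # in reality this is an HTTP request against the enrichment layer.
    return {"track_id": track_id, "enriched": True}

class EnrichmentCache:
    def __init__(self, refresh_seconds):
        # refresh_seconds == 0 models a Cache Refresh Time Interval of 0:
        # entries never expire.
        self.refresh_seconds = refresh_seconds
        self._cache = {}  # track_id -> (timestamp, record)

    def get(self, track_id):
        now = time.time()
        entry = self._cache.get(track_id)
        expired = (entry is not None
                   and self.refresh_seconds > 0
                   and now - entry[0] >= self.refresh_seconds)
        if entry is None or expired:
            # Cache miss (first observation of this TRACK_ID) or stale
            # entry: query the feature service and (re)load the cache.
            self._cache[track_id] = (now, query_feature_service(track_id))
        return self._cache[track_id][1]
```

With a refresh interval of 0, the first lookup for a TRACK_ID queries the feature service and every later lookup returns the same cached record, until the processor instance itself is recreated.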
03-14-2023 04:50 PM

POST
@wizgis -- It's not a problem that the date and time are part of the same data value being ingested. We actually prefer that. The GeoEvent Definition specifies that the input should be adapting data values as a Date. The fact that acquisitionTime is expressed as an ISO 8601 standard value helps guarantee GeoEvent Server's input will be able to adapt the value it receives and create a Date.

I'm not sure that you can configure a Filter literally with $RECEIVED_TIME, but you should be able to configure it with the name of the attribute field (e.g. acquisitionTime) as you show in your illustrations. That is how I tested the configuration I included in my previous response as a screenshot. I did notice what looks like a typo in the one screenshot you shared: I'm sure you took care of that extra bit of punctuation, but I saw it in your screenshot, so I thought I'd mention it.

One thing to check is the actual value of the Date your GeoEvent Server adapts from the ISO 8601 value received by the input. Adaptation of an ISO 8601 value assumes UTC. Given the input and output below, you'll note that the value 1669849214000 can be represented as either of the following strings:

- Wed Nov 30 15:00:14 PST 2022
- Wed Nov 30 23:00:14 GMT 2022

A string representation of the epoch value 1669849214000 must be constructed in order to extract values for hh, mm, and ss to plug into the filter's algebra. The filter, I think, is choosing to construct the string in the server's local time zone -- which for me is the first string above, showing PST. Obviously, if you are expecting the filter to only keep values between 05:00 and 17:00 and the filter is using strings constructed in local time, then 1669849214000 and 1669852814000 both satisfy the filter. These two epoch values are 15:00:14 and 16:00:14 (local, PST). The values 1669856414000 and 1669860014000 are both greater than 17:00 once the epochs are converted to strings in the local time zone.

The online utility https://www.epochconverter.com can help you convert epoch values to string values in both GMT and your local time zone. At this point, if you are still having trouble configuring the filter and working with the date/time value conversions, I think it would be helpful for you to open an incident with Esri Technical Support so that someone can work with you and your data and correlate behavior for your time zone. Hope the above helps -- RJ
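The epoch-to-string conversions above can be reproduced with a few lines of Python. This is just a worked illustration; the PST offset is hard-coded as UTC-8 and ignores daylight saving time.

```python
from datetime import datetime, timezone, timedelta

# Fixed UTC-8 offset for illustration; real Pacific time shifts with DST.
PST = timezone(timedelta(hours=-8))

def to_strings(epoch_ms):
    """Render an epoch value (milliseconds) in GMT and in local (PST) time."""
    dt_utc = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
    gmt = dt_utc.strftime("%a %b %d %H:%M:%S GMT %Y")
    pst = dt_utc.astimezone(PST).strftime("%a %b %d %H:%M:%S PST %Y")
    return gmt, pst

gmt, pst = to_strings(1669849214000)
# gmt -> "Wed Nov 30 23:00:14 GMT 2022"
# pst -> "Wed Nov 30 15:00:14 PST 2022"
```

The same epoch value yields an hour of 23 in GMT but 15 in PST, which is exactly why a 05:00-17:00 filter behaves differently depending on the time zone used to construct the string.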
03-06-2023 06:12 PM

POST
-------------------------------------------------
Release:   GeoEvent Server 11.1
Processor: GeoTagger
-------------------------------------------------

What's wrong with the GeoEvent Service below? Placing any Filter, but especially a spatial filter configured with an Enter or Exit condition, in front of a processor configured to determine Enter or Exit can produce unexpected results if the filter discards an event record the processor needs in order to correctly evaluate its conditional.

Suppose we have two geofences, Hotel_028 and Indigo_045, as illustrated below. Two locations of a trackable asset, the point locations #1 and #2 on the map, are received and processed through the GeoEvent Service illustrated above. The intent is to capture information about when the vehicle enters and exits either of the geofences. The vehicle's TRACK_ID has never before been reported, so the filters and processors are observing the vehicle's locations for the first time.

Here's the problem: Given that the vehicle's TRACK_ID has never before been observed, the default behavior for geofence entry, "First GeoEvent triggers Enter", applies. A Filter or Processor evaluating an Enter condition will recognize that the first reported position inside a geofence qualifies as an entry. The upper event processing branch will log the entry of track point #1 as expected.

The lower event processing branch will not log an exit because the default for "First GeoEvent triggers Exit" is False. With no prior observations, track point #1 does not satisfy the Exited filter's conditional that a previous event record with the same TRACK_ID was observed "inside" some other geofence, so the filter does not see the event record as exiting an area of interest and discards the event record. When the second track point #2 is received, the Exited filter, having previously observed the prior track point inside the geofence Hotel_028, recognizes the exiting condition and passes the event record along. But the processor following the filter has not seen or evaluated the first track point. With no prior observations, track point #2 does not satisfy the GeoTag OnExit GeoTagger processor's conditional, so the processor does not write the name of a geofence from which the track point has exited into the event record it processes.

The fix for this is to rework the event processing on the lower branch to move the filtering operation after the GeoTagger. Illustrated above, I have placed two mutually exclusive attribute filters after the GeoTagger processor. The first filter catches the condition when the processor does not recognize an exiting condition and therefore does not write the name of a geofence into an attribute field named GeoTag. Presumably I want to craft some specific message in this case, so I route the event record through a Field Mapper which places a simple message (as a String) into a field to indicate that the current event record has not exited an area of interest. The second filter catches the condition when the processor does recognize an exiting condition, because the GeoTag attribute field holds the name of a geofence (its value is not null). Again, I want to craft some specific message for this case, so I route the event record through a Field Mapper to place a simple message into a field to indicate the date/time the current event record exited an area of interest and the name of the area exited.
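The state dependency described above can be illustrated with a toy exit detector written in Python. This is a conceptual sketch, not GeoEvent Server code; the function signature and containment inputs are invented for illustration.

```python
# Toy exit detector: remembers the last geofence each TRACK_ID was inside.
last_geofence = {}  # TRACK_ID -> name of geofence containing the prior point

def geotag_on_exit(track_id, current_geofence):
    """Return the name of the geofence just exited, or None.
    current_geofence is the geofence containing the current point
    (None when the point is outside all geofences)."""
    previous = last_geofence.get(track_id)
    last_geofence[track_id] = current_geofence
    if previous is not None and previous != current_geofence:
        return previous
    return None

# Correct ordering: the detector sees BOTH points, so point #2 reports an exit.
assert geotag_on_exit("vehicle-1", "Hotel_028") is None  # point #1: entry only
assert geotag_on_exit("vehicle-1", None) == "Hotel_028"  # point #2: exit seen

# Filter-first ordering: point #1 was discarded upstream, so the detector
# never learned the vehicle was inside Hotel_028 and misses the exit.
last_geofence.clear()
assert geotag_on_exit("vehicle-1", None) is None  # point #2 alone: no exit
```

The last assertion is the failure mode: any stateful Enter/Exit evaluation must observe the full event stream for a TRACK_ID, which is why the filter belongs after the GeoTagger.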
03-03-2023 05:51 PM

POST
Including the above as text to support keyword search. GeoEvent Simulator Command Line Arguments
Usage options:
GeoEventSimulator.exe [Server:value] [Port:number] [Filename:value] [Skip:number] [Events:number] [Per:number] [Continuous:flag] [Play:value] [TimeField:number] [SetTime:flag]
GeoEventSimulator.exe /?
GeoEventSimulator.exe
Usage examples:
start GeoEventSimulator.exe Server:localhost Port:5565 Filename:vehicles.csv Skip:3 Events:2 Per:5000 Continuous:Once Play:Step TimeField:1 SetTime:Yes
start GeoEventSimulator.exe Server:localhost Port:5565 Filename:vehicles.csv Skip:0 Events:1 Per:1000 Continuous:Cont Play:Play TimeField:0 SetTime:Yes
start GeoEventSimulator.exe Server:localhost Port:5565 Filename:vehicles.csv TimeField:3 RealRateSpeed:200 Continuous:Once Play:play
start GeoEventSimulator.exe Server:localhost Port:5565 Filename:vehicles.csv Play:Play
start GeoEventSimulator.exe Filename:vehicles.csv Play:Play
start GeoEventSimulator.exe Port:5566 Filename:vehicles.csv Play:Play
start GeoEventSimulator.exe
Description:
Using the command line arguments you can run the simulation and optionally:
1. Connect to <ServerHostName>:<ServerPort>.
2. Load simulation file located in <Filename> and skip the first <SkipFirstLineCount> lines.
3. With PlayAction=Play - Start sending <EventsCount> events per <PerMS> milliseconds,
until reaching the last event or continuously in a loop, depending on <ContinuousPlayMode>.
4. With PlayAction=Step - Send the first <EventsCount> events once.
Arguments:
Server, ServerHostName Specifies the server host name. Default: {localhost}
Port, ServerPort Specifies the server port number. Default: {5565}
Filename Specifies the full path to the simulation file to load
Skip, SkipFirstLineCount Specifies the number of lines to skip when loading the simulation file. Default: {0}
Events, EventsCount Specifies how many Events to send per interval. Default: {1}
Per, PerMS Specifies the constant interval value in milliseconds. Default: {1000}
Continuous, ContinuousPlayMode Specifies whether to simulate events in a continuous loop. Default: {Continuous}
Options for Continuous: {Continuous | Cont | C | Yes | Y | True | T}
Options for Once: {Once | O | No | N | False | F}
Play, PlayAction Specifies the play action. Default: {Stop}
Options: {Play | Step | Stop}
TimeField Specifies the zero-based index of the Time Field. Default: {1}
RealRateSpeed Specifies the real rate speed factor in percentages. Switches the simulator to Real Rate mode. Default: {100}
SetTime, SetTimeToCurrentTime Specifies whether to override the Time Field value with the current time (in the ISO 8601 format). Default: {No}
Options for yes: {Yes | Y | True | T}
Options for no : {No | N | False | F}
/?, ?, -h, -help, --h, --help Shows this help
Other usage options:
GeoEventSimulator.exe [ServerHostName] [ServerPort] [Filename] [SkipFirstLineCount] [EventsCount] [PerMS] [ContinuousPlayMode] [PlayAction]
Other Usage examples:
start GeoEventSimulator.exe localhost 5565 vehicles.csv 3 2 5000 Once Step
start GeoEventSimulator.exe localhost 5565 vehicles.csv 0 1 1000 Cont Play
start GeoEventSimulator.exe localhost 5565 vehicles.csv 0 1 1000 Continuous
start GeoEventSimulator.exe localhost 5565 vehicles.csv 0 1 1000
start GeoEventSimulator.exe localhost 5565 vehicles.csv 0 1
start GeoEventSimulator.exe localhost 5565 vehicles.csv 0
start GeoEventSimulator.exe localhost 5565 vehicles.csv
start GeoEventSimulator.exe localhost 5565
start GeoEventSimulator.exe localhost
start GeoEventSimulator.exe
03-03-2023 11:10 AM

POST
If you open a command-line window and change directory to where the GeoEventSimulator.exe is found (C:\Program Files\ArcGIS\Server\GeoEvent is the default) you can run the simulator from a command line with a variety of arguments. Executing the command GeoEventSimulator.exe --help will display the arguments in a Windows dialog.
03-03-2023 11:05 AM

POST
It looks to me like you're using what I refer to as the "algebraic expressions" Morakot describes in his blog How to Create Temporal Filter in GeoEvent. The way I read the filter in your screenshot above, any event record whose received time is greater than 1500 hours will be allowed to pass through, as well as any event record whose received time is less than 1600 hours. All event records will satisfy that conditional. I think this is more like what you want: When you double-click to edit the filter, you need to make sure you apply the logical condition AND to the two expressions you have entered as the filter's criteria:
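The difference between joining the two expressions with OR versus AND can be seen with plain Python predicates, using minutes-since-midnight to stand in for the hh:mm values the filter extracts (an illustration only, not the filter's actual implementation):

```python
def passes_or(minutes):
    # OR: every time of day satisfies at least one of the two conditions,
    # so the filter keeps every event record.
    return minutes > 15 * 60 or minutes < 16 * 60

def passes_and(minutes):
    # AND: only times inside the intended 15:00-16:00 window satisfy both.
    return minutes > 15 * 60 and minutes < 16 * 60

assert passes_or(10 * 60) and passes_or(20 * 60)  # OR keeps everything
assert passes_and(15 * 60 + 30)                   # 15:30 passes the AND filter
assert not passes_and(17 * 60)                    # 17:00 is filtered out
```

The OR form is a tautology: no time of day can fail both conditions at once, which is why every event record passed the original filter.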
03-03-2023 10:46 AM

POST
The pattern suggested above by @EarlMedina is what you'll need to follow if you're limited to what you can configure out-of-the-box. GeoEvent Server inputs which send HTTP/GET requests to query an external REST API for data need to incorporate all necessary data in the request URL as query parameters. Making a first request to obtain authorization and then incorporating the authorization token in a second request is not something you can configure out-of-the-box. Available GeoEvent Server inbound connectors do not support the concept of authorization token expiry and refresh (like you might find when working with an authorization protocol or framework such as OAuth).

An input such as Receive JSON on a REST Endpoint, on the other hand, hosts a REST endpoint to which external clients can HTTP/POST data. GeoEvent Server does not offer any sort of authorization or client authentication in this case. If the client can reach the GeoEvent Server's endpoint, the client is allowed to make an HTTP/POST request with a data payload.

You can develop a "relay" or "bridge" using any scripting or development SDK you are familiar with to handle the multiple steps required in a publisher/subscriber pattern. The bridge you develop between the data provider and GeoEvent Server would be responsible for sending a request to obtain an authorization token and then sending periodic requests using that token to obtain data. The data response could then be relayed to GeoEvent Server via HTTP/POST. The bridge you develop in this case would also handle authorization token expiry and periodic token refresh.

Looking at the contributor history, it does not appear that the "Sample Http Inbound Transport" referred to above (http-inbound-auth-token-for-geoevent), available on GitHub and developed for polling a REST endpoint, is being actively maintained. I cannot say whether or not the sample transport can readily be incorporated into an inbound connector, how such a connector was intended to be used, what assumptions it might make, or whether it will work with a particular release of ArcGIS Enterprise. @EricIronside @MorakotPilouk
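As a sketch of the relay/bridge pattern described above, the following Python class separates the token, data, and relay steps behind injected callables. Everything here is hypothetical: the callable names, the token lifetime, and the absence of real HTTP calls are illustration choices, not a prescribed design.

```python
import time

class TokenBridge:
    """Minimal relay between a token-secured REST API and a GeoEvent Server
    'Receive JSON on a REST Endpoint' input. The three callables are injected
    so the HTTP transport and the provider's actual endpoints (both
    deployment-specific) stay out of this sketch."""

    def __init__(self, fetch_token, fetch_data, post_to_geoevent,
                 token_lifetime_s=3600):
        self.fetch_token = fetch_token            # () -> token string
        self.fetch_data = fetch_data              # (token) -> JSON payload
        self.post_to_geoevent = post_to_geoevent  # (payload) -> None
        self.token_lifetime_s = token_lifetime_s
        self._token = None
        self._acquired_at = 0.0

    def _valid_token(self):
        # Refresh the token before it expires; expiry handling is the part
        # an out-of-the-box GeoEvent input cannot do for you.
        expired = time.time() - self._acquired_at >= self.token_lifetime_s
        if self._token is None or expired:
            self._token = self.fetch_token()
            self._acquired_at = time.time()
        return self._token

    def poll_once(self):
        # One cycle: authorized request for data, then relay the payload
        # to GeoEvent Server via HTTP/POST.
        payload = self.fetch_data(self._valid_token())
        self.post_to_geoevent(payload)
        return payload
```

In a real bridge, fetch_token and fetch_data would make HTTPS requests against the data provider, post_to_geoevent would POST JSON to the hosted input's endpoint, and poll_once would be invoked on a timer.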
01-05-2023 11:37 AM

POST
Apologies for the delay in getting back with you, John. The latest version (Release 5 - July 27, 2021) currently on the AIS Connector page should work with an ArcGIS 11 deployment. We were able to address the issues identified with the connector without needing to recompile a new version. If you have problems deploying the connector or getting it to successfully adapt the data being received, please open an incident with Esri Technical Support so that we can track the issue.
12-05-2022 09:31 AM

BLOG
@BrunoGomesdeSouza -- Please submit this as a formal enhancement request through Esri Technical Support. The product development team will consider enhancing GeoEvent Server to support authentication for the client requests it makes to an organization's SMTP server. It is possible, however, that the organization's administrator will need to disable SMTP authentication (along with StartTLS) for specific mailboxes.
11-17-2022 08:52 AM

POST
Hello @TedCronin ... deploying multiple ArcGIS Server instances which coordinate through a single server site, with an ArcGIS GeoEvent Server advanced server role licensed for each instance, is a pattern the development team has been advising against for a few years now. We formally removed the capability of configuring such a deployment at the 10.9 release. In the context of this thread this would be the 'site' approach. The tutorial describing this deployment approach was removed from the GeoEvent Server Gallery.

The description of the other approach, referred to as the 'silo' approach, as laid out in the second tutorial, has changed over the last couple of years. The second tutorial was deprecated and delisted as well to try to prevent confusion moving forward. The basic concept, however, that each GeoEvent Server instance you deploy needs to run independently (not collaboratively with another instance) is the recommended deployment model.

Please refer to the updated help topics using the links below:
- Deployment considerations
- Best practices for system architecture
- Advanced deployment concepts and scenarios
- Strategies for scalability, reliability, and resiliency

If you would like to schedule a brief consultation on this, please e-mail the team at GeoEventServer@esri.com
09-16-2022 02:43 PM

POST
We use automated tests that identify whether or not a given custom processor or connector will successfully deploy. Generally, if a connector refuses to deploy, it is because something in the GeoEvent Server SDK has changed and the connector needs to be recompiled. There were a dozen components identified for ArcGIS 11 which need to be recompiled; unfortunately, the AIS connector is one of them. Part of recompilation includes documentation review and regression testing. Our apologies for the delay, but a new version of the AIS connector will need to be produced and uploaded. We are working on that. @EricIronside @TravisShore
08-09-2022 08:02 AM

BLOG
While the steps to administratively reset ArcGIS GeoEvent Server don't change very much from release to release, you are working with the system files and folders which constitute the software product, and changes to the steps are inevitable. Procedures for an administrative reset are therefore being written up as version-specific PDF files. Each downloadable PDF starts with an identification of "What Changed" at a specific release to require a change in the scripted steps. Each document includes things you should understand before executing an administrative reset. The actual administrative reset steps are on the last page(s) of each document. Please download the PDF document which is most appropriate for your software release.
08-08-2022 04:23 PM

POST
At the 10.8.1 release I believe you need to use an expression whose constant value is a decimal value (rather than an integer value). The duration field is a Long, so the Field Calculator is evaluating the expression ( duration / 60000 ) as ( Long / Long ) and returning a Long. Try an expression like:

duration / 60000.0

Adding the '.0' to the constant should force the Field Calculator to implicitly cast duration to a Double and evaluate ( Double / Double ), returning a Double, which will then be implicitly cast to a Float since that is the data type of the field into which you are writing the result. When you upgrade to 10.9.1 you will have additional explicit type/cast functions such as toDouble() and toFloat(). The 10.9.1 type/cast functions could be used to clarify the expression, as you would be able to write something like:

toFloat(duration) / 60000.0

and remove some of the implicit type/cast ambiguity. -- RJ
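Python's floor-division operator can mimic the ( Long / Long ) truncation described above (an analogy only; the GeoEvent Field Calculator is not Python):

```python
duration = 90500  # milliseconds, i.e. roughly 1.5083 minutes

# ( Long / Long ): the fractional part of the quotient is truncated.
assert duration // 60000 == 1

# ( Long / Double ): the decimal constant forces floating-point division,
# preserving the fraction of a minute.
assert round(duration / 60000.0, 4) == 1.5083
```

The same value divided two ways yields 1 minute versus about 1.5083 minutes, which is the difference the '.0' in the constant makes.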
07-05-2022 08:37 AM

BLOG
How To: Transform a delimited string of key/value pairs, placing each "value" in a separate attribute with the "key" as the attribute name

To make this work out-of-the-box you are going to have to be able to make a few assumptions about the key/value pairs being received as an array of values:
- The number of key/value pairs in the array will need to be constant
- The ordering of the key/value pairs will also need to be consistent
- The key/value pairs themselves are clearly delimited

One approach would be to use a pair of Field Mapper processors in series. Depending on the data being received, this should get you most of the way to where you want to be. There may be changes you will need to make to the example below to accommodate quotes embedded in the received data and/or null values, for example, if these are present in your data. Illustrated below, the suggested approach uses a Field Mapper to first extract the delimited key/value pairs into separate attribute values. A second Field Mapper then strips the "key" from each key/value pair, leaving only the "value" in the attribute.

Consider the following:

Data -- A string received as JSON
{"DataString": "Alpha=Apple;Bravo=Baby;Charlie=Candy;Delta=Diamond"}

Expression Patterns -- Use back-references to extract each key/value pair
First Step:  replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$1')
Second Step: replaceAll(Alpha, '.*=', '')

Configuration of the two Field Mapper processors would look something like:

Can these two operations be combined so that only one Field Mapper is used? Excellent question. Yes, functions you incorporate into an expression can be nested. The value calculated when an "inner" expression is evaluated can be used as an argument by an "outer" function to further process the data. The first step, above, would produce something like: The delimiters have been removed and each key/value pair is now in its own attribute. The second step then removes the "key" and the literal '=' in each key/value pair so that only the "value" remains.

If we combine the expressions above like so:

replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$1'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$2'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$3'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$4'), '.*=', '')

we can effectively, and efficiently, pull each key/value pair using a back-reference to a capture group, then execute a second replaceAll( ) on the extracted key/value pair to remove the "key" and keep the "value". When the DataString is ingested the following is produced, with just the values in each attribute: This is how you implement a "pivot" on a serialized data string to pull apart the key/value pairs.
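The nested replaceAll( ) expressions behave like nested re.sub( ) calls in Python, using the same regular expressions and back-references (shown here only to make the two-step pivot easy to experiment with):

```python
import re

data_string = "Alpha=Apple;Bravo=Baby;Charlie=Candy;Delta=Diamond"

def extract_value(field_number):
    # Inner call of the nested expression: pull one key/value pair via a
    # back-reference to the Nth capture group, equivalent to
    # replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$N').
    pair = re.sub(r'^(.*)[;](.*)[;](.*)[;](.*)$',
                  r'\%d' % field_number, data_string)
    # Outer call: strip the "key" and the literal '=', equivalent to
    # replaceAll(..., '.*=', '').
    return re.sub(r'.*=', '', pair)

assert [extract_value(n) for n in (1, 2, 3, 4)] == \
       ["Apple", "Baby", "Candy", "Diamond"]
```

Because the pattern requires exactly three literal ';' characters, each capture group is forced to match a single key/value pair, and the greedy '.*=' in the second step consumes everything through the '=' so only the value remains.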
02-17-2022 04:47 PM

POST
Hello @ZainabAkhtar -- Adding to what Jake and Dan have said above, the fact that Elasticsearch is the search and analytics engine for Esri's Spatiotemporal Big Data Store is an implementation detail. You should consider the SBDS another type of Enterprise geodatabase, a capability you configure when installing ArcGIS Data Store. Direct connections to the Elasticsearch engine are not supported. The only supported way to connect and work with data stored in the Spatiotemporal Big Data Store is through tools included with the different ArcGIS Server advanced server roles (e.g. GeoEvent Server, GeoAnalytics, etc.). Typically, client access is limited to the REST interfaces exposed through map/feature services published as you create/publish hosted feature layers. Your only option for a direct connection to Elasticsearch would be to contract with Esri Professional Services for implementation assistance in developing a custom extension to ArcGIS Enterprise. Hope this information helps -- RJ
01-18-2022 09:09 AM