POST
If you open a command-line window and change directory to the folder containing GeoEventSimulator.exe (C:\Program Files\ArcGIS\Server\GeoEvent is the default), you can run the simulator from the command line with a variety of arguments. Executing the command GeoEventSimulator.exe --help will display the available arguments in a Windows dialog.
Posted 03-03-2023, 11:05 AM
POST
It looks to me like you're using what I refer to as the "algebraic expressions" Morakot describes in his blog How to Create Temporal Filter in GeoEvent. The way I read the filter in your screenshot above, any event record whose received time is greater than 1500 hours will be allowed to pass through, as well as any event record whose received time is less than 1600 hours. Every event record will satisfy that conditional. I think this is closer to what you want: when you double-click to edit the filter, make sure you apply the logical operator AND to the two expressions you have entered as the filter's criteria.
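To see why the OR combination passes everything, here is a small illustration in plain Python (not GeoEvent Server code; representing "received time" as an HHMM integer and the function names are purely illustrative):

```python
# Hypothetical sketch: why the two temporal expressions must be joined
# with AND rather than OR. Times are represented as HHMM integers.

def passes_or(received_hhmm):
    # OR: every possible time is either > 1500 or < 1600, so this
    # condition is always true and nothing is ever filtered out.
    return received_hhmm > 1500 or received_hhmm < 1600

def passes_and(received_hhmm):
    # AND: only times inside the 1500-1600 window pass.
    return received_hhmm > 1500 and received_hhmm < 1600

for t in (900, 1530, 2200):
    print(t, passes_or(t), passes_and(t))
```

Running this shows the OR variant returning True for all three sample times, while the AND variant is only True for 1530.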
Posted 03-03-2023, 10:46 AM
POST
The pattern suggested above by @EarlMedina is what you'll need to follow if you're limited to what you can configure out-of-the-box. GeoEvent Server inputs which send HTTP/GET requests to query an external REST API for data must incorporate all necessary data in the request URL as query parameters. Making one request to obtain an authorization token and then incorporating that token in a second request is not something you can configure out-of-the-box. Available GeoEvent Server inbound connectors also do not support the concept of authorization token expiry and refresh (as you might find when working with an authorization protocol or framework such as OAuth).

An input such as Receive JSON on a REST Endpoint, on the other hand, hosts a REST endpoint to which external clients can HTTP/POST data. GeoEvent Server does not offer any sort of authorization or client authentication in this case. If a client can reach the GeoEvent Server's endpoint, it is allowed to make an HTTP/POST request with a data payload.

You can develop a "relay" or "bridge" using any scripting language or development SDK you are familiar with to handle the multiple steps required in a publisher/subscriber pattern. The bridge you develop between the data provider and GeoEvent Server would be responsible for sending a request to obtain authorization and then sending periodic requests using that token to obtain data. Each data response could then be relayed to GeoEvent Server via HTTP/POST. The bridge would also handle authorization token expiry and periodic token refresh.

Looking at the contributor history, it does not appear that the "Sample Http Inbound Transport" referred to above (http-inbound-auth-token-for-geoevent), available on GitHub and developed for polling a REST endpoint, is being actively maintained. I cannot say whether the sample transport can readily be incorporated into an inbound connector, how such a connector was intended to be used, what assumptions it might make, or whether it will work with a particular release of ArcGIS Enterprise. @EricIronside @MorakotPilouk
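If it helps, here is a minimal sketch of such a bridge in Python (standard library only). Every URL, parameter name, and token field below is a hypothetical placeholder -- the real provider's authorization API and your GeoEvent Server receiver endpoint will differ -- and this is not an official Esri utility:

```python
"""Sketch of a 'bridge' between a token-secured provider API and a
GeoEvent Server 'Receive JSON on a REST Endpoint' input. All URLs,
parameter names, and token fields are hypothetical placeholders."""
import json
import time
import urllib.request

AUTH_URL = "https://provider.example.com/oauth/token"                                 # hypothetical
DATA_URL = "https://provider.example.com/api/observations"                            # hypothetical
GEOEVENT_URL = "https://geoevent.example.com:6143/geoevent/rest/receiver/my-json-in"  # hypothetical

def token_expired(acquired_at, ttl_seconds, now=None, slack=60):
    """True when the token is past (or within `slack` seconds of) expiry."""
    now = time.time() if now is None else now
    return now >= acquired_at + ttl_seconds - slack

def post_json(url, payload, headers=None):
    """HTTP/POST a JSON payload and return the decoded JSON response."""
    req = urllib.request.Request(
        url, data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", **(headers or {})})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def run_bridge(poll_seconds=30):
    token, acquired_at, ttl = None, 0.0, 0.0
    while True:
        if token is None or token_expired(acquired_at, ttl):
            # Step 1: obtain (or refresh) the authorization token.
            auth = post_json(AUTH_URL, {"client_id": "...", "client_secret": "..."})
            token, ttl, acquired_at = auth["access_token"], auth["expires_in"], time.time()
        # Step 2: request data using the token.
        data = post_json(DATA_URL, {}, headers={"Authorization": f"Bearer {token}"})
        # Step 3: relay the payload to GeoEvent Server via HTTP/POST.
        post_json(GEOEVENT_URL, data)
        time.sleep(poll_seconds)

if __name__ == "__main__":
    run_bridge()
```

In practice you would run something like this as a scheduled task or service on a machine that can reach both the provider API and the GeoEvent Server endpoint.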
Posted 01-05-2023, 11:37 AM
POST
Apologies for the delay in getting back to you, John. The latest version (Release 5 - July 27, 2021) currently on the AIS Connector page should work with an ArcGIS 11 deployment. We addressed the issues we had identified with the connector without needing to recompile a new version. If you have problems deploying the connector or getting it to successfully adapt the data being received, please open an incident with Esri Technical Support so that we can track the issue.
Posted 12-05-2022, 09:31 AM
BLOG
@BrunoGomesdeSouza -- Please submit this as a formal enhancement request through Esri Technical Support. The product development team will consider enhancing GeoEvent Server's support for the client requests it makes to an organization's SMTP server. It is possible, however, that the organization's administrator will need to disable SMTP authentication (along with StartTLS) for specific mailboxes.
Posted 11-17-2022, 08:52 AM
POST
Hello @TedCronin ... deploying multiple ArcGIS Server instances which coordinate through a single server site, with an ArcGIS GeoEvent Server advanced server role licensed for each instance, is a pattern the development team has been advising against for a few years now. We formally removed the capability of configuring such a deployment at the 10.9 release. In the context of this thread, this would be the 'site' approach; the tutorial describing this deployment approach was removed from the GeoEvent Server Gallery. The description of the other approach, referred to as the 'silo' approach and laid out in the second tutorial, has changed over the last couple of years. The second tutorial was deprecated and delisted as well to try to prevent confusion moving forward. The basic concept, however, remains: each GeoEvent Server instance you deploy should run independently (not collaboratively with another instance). That is the recommended deployment model. Please refer to the updated help topics below:
- Deployment considerations
- Best practices for system architecture
- Advanced deployment concepts and scenarios
- Strategies for scalability, reliability, and resiliency
If you would like to schedule a brief consultation on this, please e-mail the team at GeoEventServer@esri.com
Posted 09-16-2022, 02:43 PM
POST
We use automated tests that identify whether or not a given custom processor or connector will successfully deploy. Generally, if a connector refuses to deploy it is because something in the GeoEvent Server SDK has changed and the connector needs to be recompiled. There were a dozen components identified for ArcGIS 11 which need to be recompiled; unfortunately, the AIS connector is one of them. Recompilation also includes a documentation review and regression testing. Our apologies for the delay, but a new version of the AIS connector will need to be produced and uploaded. We are working on that. @EricIronside @TravisShore
Posted 08-09-2022, 08:02 AM
BLOG
While the steps to administratively reset ArcGIS GeoEvent Server don't change very much from release to release, you are working with the system files and folders which constitute the software product, so changes to the steps are inevitable. Procedures for an administrative reset are therefore being written up as version-specific PDF files. Each downloadable PDF starts by identifying "What Changed" at a specific release to require a change in the scripted steps. Each document includes things you should understand before executing an administrative reset; the actual administrative reset steps are on the last page(s) of each document. Please download the PDF document most appropriate to your software release.
Posted 08-08-2022, 04:23 PM
POST
At the 10.8.1 release I believe you need to use an expression whose constant value is a decimal value (rather than an integer value). The duration field is a Long, so the Field Calculator evaluates the expression ( duration / 60000 ) as ( Long / Long ) and returns a Long. Try an expression like: duration / 60000.0 -- adding the '.0' to the constant should force the Field Calculator to implicitly cast duration to a Double and evaluate ( Double / Double ), returning a Double, which will then be implicitly cast to a Float since that is the data type of the field into which you are writing the result. When you upgrade to 10.9.1 you will have additional explicit type-cast functions such as toDouble() and toFloat(). The 10.9.1 type-cast functions could be used to clarify the expression, as you would be able to write something like toFloat(duration) / 60000.0 and remove some of the implicit type-cast ambiguity. -- RJ
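If it helps to see the truncation outside of GeoEvent Server, here is a sketch in Python (the Field Calculator itself evaluates Java-style expressions; Python's // operator mimics the Long / Long truncation, while / 60000.0 keeps the fraction; the sample duration value is arbitrary):

```python
# Mimicking the type behavior described above. In the Field Calculator,
# ( Long / Long ) truncates like Java integer division; Python's //
# reproduces that. Dividing by the decimal constant keeps the fraction.

duration = 90_500  # milliseconds (a Long in the GeoEvent Definition)

long_result = duration // 60000      # ( Long / Long ) -> 1, fraction lost
double_result = duration / 60000.0   # ( Double / Double ) -> ~1.508 minutes

print(long_result, double_result)
```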
Posted 07-05-2022, 08:37 AM
BLOG
How To: Transform a delimited string of key/value pairs, placing each "value" in a separate attribute with the "key" as the attribute name.

To make this work out-of-the-box you are going to have to be able to make a few assumptions about the key/value pairs being received:
- The number of key/value pairs in the string is constant
- The ordering of the key/value pairs is consistent
- The key/value pairs themselves are clearly delimited

One approach is to use a pair of Field Mapper processors in series. Depending on the data being received, this should get you most of the way to where you want to be. There may be changes you will need to make to the example below to accommodate quotes embedded in the received data and/or null values, if these are present in your data. The suggested approach uses a Field Mapper to first extract the delimited key/value pairs into separate attribute values. A second Field Mapper then strips the "key" from each key/value pair, leaving only the "value" in the attribute.

Consider the following data, a string received as JSON:
{"DataString": "Alpha=Apple;Bravo=Baby;Charlie=Candy;Delta=Diamond"}

The expression patterns use back-references to extract each key/value pair.
First step: replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$1')
Second step: replaceAll(Alpha, '.*=', '')

Can these two operations be combined so that only one Field Mapper is used? Excellent question. Yes -- functions you incorporate into an expression can be nested. The value calculated when an "inner" expression is evaluated can be used as an argument by an "outer" function to further process the data. The first step, above, removes the delimiters and places each key/value pair in its own attribute. The second step then removes the "key" and the literal '=' in each key/value pair so that only the "value" remains. If we combine the expressions like so:
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$1'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$2'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$3'), '.*=', '')
replaceAll(replaceAll(DataString, '^(.*)[;](.*)[;](.*)[;](.*)', '$4'), '.*=', '')
we can effectively, and efficiently, pull each key/value pair using a back-reference to a capture group, then execute a second replaceAll( ) on the extracted pair to remove the "key" and keep the "value". When the DataString is ingested, only the values remain in each attribute. This is how you implement a "pivot" on a serialized data string to pull apart its key/value pairs.
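The nested replaceAll() expressions can be reproduced outside GeoEvent Server for testing. Here is a sketch using Python's re module (Java-style $1 back-references become \1 in Python; otherwise it is the same pattern):

```python
import re

# Reproducing the nested replaceAll() pivot from the post with re.sub().
data_string = "Alpha=Apple;Bravo=Baby;Charlie=Candy;Delta=Diamond"

pattern = r'^(.*)[;](.*)[;](.*)[;](.*)'
values = [
    # Inner sub: extract capture group i ($i in Java, \i in Python).
    # Outer sub: strip the "key" and the literal '=' from the pair.
    re.sub(r'.*=', '', re.sub(pattern, rf'\{i}', data_string))
    for i in (1, 2, 3, 4)
]
print(values)  # -> ['Apple', 'Baby', 'Candy', 'Diamond']
```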
Posted 02-17-2022, 04:47 PM
POST
Hello @ZainabAkhtar -- Adding to what Jake and Dan have said above, the fact that Elasticsearch is the search and analytics engine for Esri's Spatiotemporal Big Data Store is an implementation detail. You should consider the SBDS another type of enterprise geodatabase, a capability you configure when installing ArcGIS Data Store. Direct connections to the Elasticsearch engine are not supported. The only supported way to connect and work with data stored in the Spatiotemporal Big Data Store is through tools included with the different ArcGIS Server advanced server roles (e.g., GeoEvent Server, GeoAnalytics, etc.). Typically, client access is limited to the REST interfaces exposed through map/feature services published as you create and publish hosted feature layers. Your only option for a direct connection to Elasticsearch would be to contract with Esri Professional Services for implementation assistance in developing a custom extension to ArcGIS Enterprise. Hope this information helps -- RJ
Posted 01-18-2022, 09:09 AM
POST
Hello Drew -- What you want to do is not possible using only a Filter and an Output when writing data out to a JSON or CSV (delimited text) file. Event records received and adapted by an input should be considered atomic. In other words, a GeoEvent Service (generally) doesn't cache or retain any information about event records previously processed. You cannot compare one event record to another, so there is no way to determine that an event record whose attribute Name matches 'John' is similar to another event record -- at least not using a Filter.

The Event Volume Controller processor does cache data in order to emit only one record matching some filterable criteria every "so many" seconds. This doesn't do exactly what you want -- to emit exactly one event record and discard all duplicates. I mention it only to highlight that you need something that caches and holds event records in order to compare one to another.

I think what you probably want to do is use data from the event records being processed to update a feature record, via a feature service, in a geodatabase. You can choose to update just the feature record's date/time attribute, using the attribute field Name as the TRACK_ID to determine which "user" feature record to update. I would probably choose to record the date/time the data was last observed and the feature record updated. You can then script a query to pull data from the feature service and create the "logged once only" list of data records with specific user names.

However, since GeoEvent Server (generally) receives, adapts, and processes every data record independently, every data record matching a Filter you configure will be routed through whatever workflow your GeoEvent Service is configured to perform, and all processed event records will be directed to an Output. If that output writes data to a system file, every data record with a Name matching 'John' will be written to the file. If that output updates a database record, every data record with a Name matching 'John' will be used to update a record in the database. Hope this information helps -- RJ
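To illustrate the point about caching, here is a sketch in plain Python of "emit the first occurrence only" behavior. This is not something you can configure in GeoEvent Server out-of-the-box; it just shows why state (a cache of previously seen records) is required for de-duplication:

```python
# De-duplication requires remembering what has already been seen --
# something a stateless Filter cannot do. The event dictionaries and
# the "Name" key below are illustrative.

def first_occurrence_only(events, key="Name"):
    seen = set()
    for event in events:
        if event[key] not in seen:
            seen.add(event[key])
            yield event   # emit the first record per name, drop duplicates

events = [
    {"Name": "John", "ts": 1},
    {"Name": "Ana",  "ts": 2},
    {"Name": "John", "ts": 3},  # duplicate -- discarded
]
print(list(first_occurrence_only(events)))
```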
Posted 12-30-2021, 08:45 AM
POST
Hey Jamie -- "... those two fields are updated in the hosted feature [but] all other fields are overwritten to null. I don't understand why fields not mapped are being updated. Is it even possible to update one field and leave the rest untouched? Is the Field Enricher (Feature Service) processor required?" I'm not sure the question above was ever answered, so as a follow-up:

You are correct when you said, earlier, that the purpose of a Field Mapper is to copy data from one data structure / schema / GeoEvent Definition to another. You generally use a Field Mapper to simplify and/or flatten a data structure to match the schema a feature service expects. That way, when an outbound connector (or "Output") deconstructs an event record to form a REST request, it can send the request to a feature service and the data in the client request will conform to the feature service's requirements.

It is possible to update a single feature record attribute (or just a few attributes) and leave the others untouched. I would refer you to the article Using a partial GeoEvent Definition to update feature records for details on how to do this. Bottom line up front: if you leave fields unmapped in a Field Mapper configuration, you are actually writing null values into the target data structure. So, yes, those null values end up being used to update feature record attributes, overwriting whatever data the feature record had.

A Field Enricher (Feature Service) processor does something very different from a Field Mapper. The latter copies data from an event record to organize existing data in a different data structure. A Field Enricher performs an attribute join between an event data record and an external data record to incorporate data not present in the event record (enriching the event record with related data). Hope some of this helps clarify the tasks the processors you are configuring are actually doing. -- RJ
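A small sketch of the difference (the attribute names here are hypothetical, and the dictionaries are simplified illustrations rather than exact feature service request payloads):

```python
import json

# Why unmapped fields overwrite data: a Field Mapper writing to the full
# target schema emits explicit nulls for fields it did not map, while a
# partial definition simply omits them so the feature service leaves
# those attributes untouched on update.

# Full schema, two fields left unmapped -> explicit nulls are written.
full_mapping = {"objectid": 42, "status": "ACTIVE", "owner": None, "notes": None}

# Partial definition -> omitted fields are absent, not null.
partial_mapping = {"objectid": 42, "status": "ACTIVE"}

print(json.dumps({"attributes": full_mapping}))
print(json.dumps({"attributes": partial_mapping}))
```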
Posted 12-30-2021, 08:15 AM
POST
@JamieLambert -- Have you taken a look at the blog Debug Techniques - Add/Update Feature Outputs? If you request DEBUG logging on the com.esri.ges.transport.featureService.FeatureServiceOutboundTransport component logger, you can see the REST requests your configured Update a Feature output is sending to the ArcGIS Server feature service. The DEBUG logs in GeoEvent Server provide a lot of information you can use to figure out what's missing from the REST requests. The logging was reworked for the 10.8.1 release, with even more information available via TRACE level logging if you need the component logger to be more verbose. You can also work with the updateFeatures endpoint for the feature service in the ArcGIS REST Services Directory (taking GeoEvent Server out of the equation while debugging) to iterate on different requests and see what a web client needs to send to successfully update feature records.
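As an illustration of exercising the updateFeatures endpoint directly, here is a sketch that builds (but does not send) such a request. The service URL and attribute names are placeholders; the features and f parameters are the standard updateFeatures request parameters:

```python
import json
import urllib.parse

# Building an updateFeatures request body like the ones you can test in
# the ArcGIS REST Services Directory. URL and attributes are hypothetical.
service_url = "https://myserver.example.com/arcgis/rest/services/MyLayer/FeatureServer/0"

features = [{"attributes": {"objectid": 101, "status": "UPDATED"}}]
body = urllib.parse.urlencode({"features": json.dumps(features), "f": "json"})

print(service_url + "/updateFeatures")
print(body)
```

Posting this form-encoded body to the service's updateFeatures endpoint (with whatever token your service requires) lets you confirm what a successful client request looks like before comparing it against the DEBUG logs.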
Posted 12-21-2021, 12:56 PM
POST
Hello Walter -- The short answer, I think, is that GeoEvent Server displays Java data types in its pick list. When you configure a GeoEvent Definition, you choose a data type for an event record attribute from one of those available types. The esriFieldTypeXXX data types come into play when looking at an ArcGIS Enterprise hosted feature layer or feature service which stores feature records in an enterprise geodatabase. The .NET API help topic lists the supported data types for feature record attributes. Several of these GeoEvent Server does not support (like esriFieldTypeBlob); others, like esriFieldTypeInteger, are compatible with a GeoEvent Definition specifying the 'Java' data type Integer. It can get a little tricky trying to keep straight the differences between the 64-bit Java data types used by GeoEvent Server and the 32-bit data types used by other parts of ArcGIS Enterprise.

Luckily, date/time values are always compatible between the 'Java' type Date and the ArcGIS Enterprise esriFieldTypeDate data type. So long as the event records you route to a Send Features to a Stream Service output (for example) have date/time values mapped as Date, the outbound adapter should be able to construct a data record formatted as Esri Feature JSON representing the date/time as an epoch long integer value (a 13-digit integer representing milliseconds since the Unix epoch). That is what you should see in the ArcGIS REST Services Directory when subscribing to a stream service to see the feature records the service broadcasts. It is also what you see when querying a feature service "at REST" using the query endpoint exposed by the feature service. What you see when looking at a pop-up in a web mapping application is up to the web application -- it will probably represent the epoch value it gets from the database as a more human-readable string, and might even do you the favor of shifting the value from UTC to the local time of your client/server machine.

I'm not familiar with the GeoEvent Server component logger StreamService-IRIS-delete captured in your screenshot. I would want to take a look at how the event processing upstream from the Send Features to a Stream Service output has been configured. I'm just guessing, but maybe by the time the event record reaches the output the data type for unit-activitydatetime has changed? Maybe it is a String or a Long integer value rather than a Date? The product team does rely on Esri Technical Support to work through these types of issues with customers, so I would encourage you to open an incident so that an analyst can be assigned to investigate the issue and assist with reproducibility (if necessary). If you think it necessary, you can request a GeoEvent Server specialist be assigned when you open the incident. Hope this information helps -- RJ
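As a quick illustration of the epoch representation (plain Python, with an arbitrary example date):

```python
from datetime import datetime, timezone

# How a date/time value ends up as a 13-digit epoch value (milliseconds
# since the Unix epoch, UTC) in Esri Feature JSON. The sample date is
# arbitrary.
dt = datetime(2021, 12, 16, 11, 10, 0, tzinfo=timezone.utc)
epoch_ms = int(dt.timestamp() * 1000)
print(epoch_ms)  # a 13-digit long integer
```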
Posted 12-16-2021, 11:10 AM