BLOG
Back in June of 2019, with the 10.7.1 release of ArcGIS GeoEvent Server, we shared the news of new documentation for the available out-of-the-box input and output connectors. Today, we are happy to announce new and expanded documentation for each of the available processors in GeoEvent Server 10.8. Like the connector documentation, each processor now has expanded information to help you properly configure it.

Access the new processor documentation by visiting What are Processors?, where you'll find links to each individual processor you may be interested in. Alternatively, you can access the new processor help directly in ArcGIS GeoEvent Manager when editing or configuring a new processor; simply click the Help button in the processor property dialog to expand the embedded help. It's important to note that even though this new documentation arrives with the 10.8 release, the information is still applicable to previous versions as well.

The new processor documentation follows the same format and style as the input and output connector documentation that you may already be familiar with. First, you'll see a general summary as well as some example use cases for each processor. Beneath that you'll find detailed usage notes, intended to provide extra contextual information for successfully configuring each processor. Following the usage notes is a parameters table that includes detailed information about each processor's parameters, what each parameter does, options for configuring it, and more. The parameter table lists all of the available parameters, meaning you can learn about the parameters that are shown by default in addition to the conditional parameters that are initially hidden due to their dependencies on other parameters. We recognize that many of the available processors have their own nuances and quirks, so we've included a final section that details various considerations and limitations. This information is intended to provide additional context about how the processor fundamentally works and will hopefully help guide how you approach incorporating it into your GeoEvent Services.

As with our input and output connectors, you can find step-by-step tutorials on how to configure many of the available processors on the ArcGIS GeoEvent Server Gallery.
Posted 02-13-2020 03:02 PM

POST
Hi James,

You're welcome. To answer your question about whether filters can cause a service to slow down, the short answer is yes (with some nuances to consider). Any time you add filter or processor elements to a GeoEvent Service, you are effectively introducing additional processing overhead. Each element requires a varying amount of system resources to perform its designated work on the inbound real-time data. When we take into account the rate, schema, and size of the data (as well as other external factors such as network latency and system resources), this type of discussion can very quickly become more nuanced. I don't bring this up to give the false impression that a single filter tanks the performance of a GeoEvent Service; that's not true. Rather, the takeaway is that all of these variables can have varying effects on the total throughput of a GeoEvent Service (and by extension, a GeoEvent Server site overall).

While it's certainly best to consider the above points when building out a GeoEvent Service, I don't think it's directly related to the problem you described. What strikes me as interesting is what you said about how the behavior improves when three of your four processors (I read this as routes) are disabled. This tells me there might be an issue with the design of your GeoEvent Service as it relates to event record throughput. It's important to consider how GeoEvent Server handles event records when there's more than one element in the service to send data to.

Take your GeoEvent Service for example. You have one input connector (named "ENRICHED_JSON") that is configured to send data down four separate paths to four different elements (PDS Filter, SUMMARY_FILTER, filter, and the All Message Mapper). If your data is coming in at a rate of 1000 events per second, for example, GeoEvent Server is actually taking a copy of those 1000 records and sending them to each of the four elements. At any given point in time, this works out to 4000 events being processed per second, since the inbound data is replicated for each route. The filters might be discarding a lot of the data downstream, but that doesn't change the initial pressure at the start of the service. It makes sense to think that the bottom route is handling 'the most' data, but in truth each of the first four elements is handling the same amount of data (they're just doing different things with it). It's the fourth element that is passing the most data through.

If you're interested in learning more about GeoEvent Server best practices, ideal service design, performance, etc., I would recommend that you check out some of our best practice conference session recordings (example). If you'd like to send me a direct message, perhaps we can set up some time to discuss this in more detail. I still have some questions about the behavior you're seeing, the rate of your data, what the end goal is as far as writes to the feature service are concerned, etc. I don't want to mislead you with any particular advice if the issue is in fact related to something else entirely that wasn't covered here. Let me know.
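To make the fan-out arithmetic above concrete, here is a trivial sketch (the rates are hypothetical, and this is just illustrative math, not GeoEvent Server code): every route receives its own copy of each inbound event record, so the work multiplies with the number of routes regardless of what downstream filters discard.

```python
# Fan-out model: each route gets a full copy of the inbound stream.

def effective_event_rate(inbound_rate, routes):
    """Total event records handled per second at the start of the service
    when the input sends a copy of each record down every route."""
    return inbound_rate * routes

# 1000 events/second fanned out to 4 elements:
print(effective_event_rate(1000, 4))  # 4000
```

The point of the sketch is that the multiplier applies before any filtering happens, which is why disabling routes changes the observed load.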
Posted 12-18-2019 08:12 AM

POST
Hi James,

To help clarify, are you looking to treat all objects in your JSON array as a single record (analogous to a single row in a database table), or are you looking to simply 'queue' individual records (objects in the array) in such a way that any adds/edits to your feature service are done in 'bulk'?

You had asked: "I would like to know if there is a way to treat all of the array elements as a single event which in turn would lead to a single transaction." This would imply that you want to merge all array elements (or objects) so that they form a single event record (i.e., one row in a database table). Likewise, you had also asked: "Again, I would like to insert multiple rows in a single transaction using the add feature output connector." This would imply that you want to maintain each array element as its own respective event record; rather than adding these one by one in real time, you would like to add several in bulk. No? I assume you are asking about the latter scenario, but I would like to be completely sure since the two questions entail entirely different responses.

If I am to assume the latter, you might want to look into the Add a Feature output connector properties "Update Interval (seconds)" and, as you've pointed out already, "Maximum Features Per Transaction".

The "Update Interval (seconds)" parameter is important here since it specifies how frequently the output connector will queue processed event records, make a request to the feature service of interest to add those records as new feature records, and then flush its cache to make room for additional processed event records. By default this property is set to 1 second, but you could adjust it to something of your choosing, like 2 minutes (120 seconds). Using 120 seconds as an example, GeoEvent Server will queue all processed event records (from your input connector) for 2 minutes. Once the 2-minute window has lapsed, GeoEvent Server will make a request to the feature service to add its queued event records as new feature records. GeoEvent Server will then flush its cache to make room for the next set of processed event records, only to repeat the same process 120 seconds later. The important thing to keep in mind is that this property works on a repeating time interval, not a record count (it does not mean 'add every time 200 event records have been queued'). If you know that your data is coming into GeoEvent Server at a consistent rate (e.g., 20 event records per second), you could hypothetically adjust this property to match what you're looking for with respect to a certain 'transaction size'.

Of near equal importance is the "Maximum Features Per Transaction" property. This property controls how many records are included in any single feature service request to add new feature records. By default it is set to 500, which means each add request will contain a maximum of 500 records. Going back to the example with a 120-second update interval, let's say you've queued 1800 records over that period of time. This property ensures that 4 transactions are made to the feature service (500 + 500 + 500 + 300 = 1800) to add the new feature records. It's important to note that this property doesn't limit the number of requests made (it does not mean 'only make 2 requests at the maximum transaction size regardless of how much data is still queued'). You might be thinking that this property isn't very useful for your scenario, and that might be true, but I thought it important to share in case you need the requests made to your feature service to generally contain a certain record count (and nothing more). Realistically, this property was intended to help reduce the burden on an external server/service by providing transaction management.
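The interplay of the two properties can be sketched like this (a toy model of the batching behavior described above, not actual GeoEvent Server code): records queued during one update interval are split into add-feature requests of at most the maximum transaction size.

```python
def plan_transactions(queued_records, max_per_transaction=500):
    """Split the records queued over one update interval into batches,
    one batch per add-features request."""
    batches = []
    remaining = queued_records
    while remaining > 0:
        batch = min(remaining, max_per_transaction)
        batches.append(batch)
        remaining -= batch
    return batches

# 1800 records queued over a 120-second interval -> four requests
print(plan_transactions(1800))  # [500, 500, 500, 300]
```

Note how the batch count falls out of the queued total, which is why the property caps transaction size but never the number of requests.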
If you're looking to find out more information regarding the former scenario, please feel free to let me know.
Posted 12-12-2019 07:02 AM

POST
Hi khadija F,

To confirm whether or not the input connector is working as it should, we would likely want to see how your data is formatted in the CSV. That'll help confirm the validity of some of the minor settings you've configured. That aside, what immediately stands out to me about your configuration is your usage of the Input Directory property. You probably want to either leave that blank or change it altogether if you haven't tried doing so already. More on why below.

When working with the "Watch a Folder for New CSV Files" input connector, there are really two properties that tell GeoEvent Server where to find the CSV. The first is the Input Folder Data Store, which is essentially a path to a folder location that has been registered with GeoEvent Server. An example of a registered data store folder is the "Automatic Backups" folder included with the installation of GeoEvent Server; "Automatic Backups" points to the path C:\ProgramData\Esri\GeoEvent. The second property is Input Directory (optional). The Input Directory property tells GeoEvent Server where to look relative to the specified Input Folder Data Store path.

In the case of your configuration, you specified the Input Folder Data Store as "IVMS". For example purposes, let's say the corresponding path for "IVMS" is actually E:\TestFolder (just as "Automatic Backups" points to C:\ProgramData\Esri\GeoEvent). At the same time, you've also configured the Input Directory property to point to "C:\GEOEVENT\INPUT". This is the location relative to E:\TestFolder that you've specified. Considering the two factors above, you are essentially telling GeoEvent Server to look at E:\TestFolder\C:\GEOEVENT\INPUT for your CSV. I could be mistaken here, but I doubt that path exists (or is the path you wanted to specify). This is the likely reason why nothing is coming through.

You should only specify an Input Directory when the registered folder data store isn't the exact location of the file you want GeoEvent Server to read from. I often specify the exact folder location of my content (as a registered folder) and therefore don't need to specify an Input Directory path. However, it's worth mentioning that the property exists so that if somebody has a registered folder data store set to "C:\GeoEvent", for example, they could configure several other input connectors to point to different sub-folder locations underneath C:\GeoEvent without having to register just as many folders. For example, let's say I have one folder for CSV data (called CSV_Folder) and another folder for JSON data (called JSON_Folder) under my hypothetical registered C:\GeoEvent data store folder. If I were configuring a new "Watch a Folder for New CSV Files" input connector, I could set its Input Directory to CSV_Folder relative to the registered C:\GeoEvent data store folder (i.e., GeoEvent Server would look in C:\GeoEvent\CSV_Folder for CSV files). Likewise, I could configure a separate "Watch a Folder for New JSON Files" input connector to use JSON_Folder as its Input Directory (i.e., GeoEvent Server would look in C:\GeoEvent\JSON_Folder for JSON files).

TLDR: To simplify things a bit, I suggest creating a new registered folder (Input Folder Data Store) that points to the exact path where your CSV resides. Once you have done that, keep the Input Directory property empty. You may need to let GeoEvent Server create a GeoEvent Definition initially to make sure the data can be brought in once the path is correct.
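A small sketch of how I understand the two properties combine (this is my assumption of the behavior described above, not GeoEvent Server's actual implementation): the Input Directory is simply appended beneath the registered data store path, so supplying an absolute path as the Input Directory produces a combined path that cannot exist.

```python
def resolve_watch_path(data_store_path, input_directory=""):
    """Naive concatenation of a registered folder path and an optional
    Input Directory, mirroring the relative-path behavior described above."""
    if not input_directory:
        return data_store_path
    return data_store_path.rstrip("\\") + "\\" + input_directory.lstrip("\\")

# Registered folder only: the CSV is looked for directly in the data store
print(resolve_watch_path(r"E:\TestFolder"))              # E:\TestFolder
# Relative sub-folder, as intended
print(resolve_watch_path(r"C:\GeoEvent", "CSV_Folder"))  # C:\GeoEvent\CSV_Folder
# Absolute path mistakenly supplied as the Input Directory
print(resolve_watch_path(r"E:\TestFolder", r"C:\GEOEVENT\INPUT"))
# E:\TestFolder\C:\GEOEVENT\INPUT -- a path that cannot exist
```

The last case is exactly the misconfiguration suspected here: the combined path embeds a second drive letter mid-path.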
Posted 10-14-2019 05:51 AM

POST
Hi Patricia,

One thing you could check is the MIME type included as part of the polling request. Is there a specific MIME type used by your API that isn't one of the defaults included with the "Poll an External Website for JSON" input connector? You could monitor the response from the API request in the browser to help determine this. Normally I would expect a status of 415 if this were the issue, but I am not sure what the rest of your logs say. Nevertheless, I thought I would put this out there based on some of my experience with this type of behavior.

If you haven't done so already, I might also suggest entering the entire request URL in the URL parameter as opposed to segmenting it via the other parameter properties. This helps reduce the likelihood of an incorrect URL being provided (though this is unlikely). Ideally we would want to take a closer look at the rest of your logs to see what exactly may be going on here. It might be worth reaching out to Esri Support Services for assistance if no other suggestions are of immediate benefit.

Kind regards,
Gregory
Posted 10-08-2019 05:37 AM

POST
Hello Eric, You're welcome. I am glad that I was able to help pull the blinders off. Feel free to reach out if you have any further questions about this particular problem. I am happy to be of help.
Posted 10-07-2019 09:39 AM

POST
There are some details about your environment that we would likely want to clarify, but generally speaking you will want to use GeoEvent Server to apply real-time adds or updates to a feature service whose underlying data source references tables in your SQL database (more or less). GeoEvent Server itself doesn't make a direct connection to any database. Rather, it uses user-configured server connections (ArcGIS Online, ArcGIS Enterprise, ArcGIS Server), and the map/feature services available from those connections, to facilitate adds/updates to the underlying databases registered with those 'servers' (e.g., the ArcGIS Data Store as a managed database, SQL Server as a database, Oracle as a database, hosted in ArcGIS Online, etc.). In brief, adding, removing, or updating data is handled through the feature services which expose the underlying data in a database.

I am not familiar with the ins and outs of your environment, but generally speaking I could see a hypothetical workflow here where you create a registered server connection in GeoEvent Server that points to your ArcGIS Enterprise. From there, you can publish feature services (or use existing feature services) which reference some sort of enterprise geodatabase, like SQL Server in your case, that has already been configured with your ArcGIS Enterprise's hosting server. As real-time information about your snow plows comes into GeoEvent Server, you can set up GeoEvent Services to perform analysis on that data and then use that new information to apply updates to the feature service you published (which is essentially pointing back to a table in your SQL database).

One thing you mentioned was creating spatial views between the event features and the street lines in SQL to then publish a street service symbolized by treatment time. While this is certainly a possibility, it is worth pointing out that this type of problem can be solved directly within a GeoEvent Service using its analytic capabilities. For example, it wouldn't be a far-fetched idea to regularly poll buffered street data (let's say every 5 seconds) as the event features. Snow plows could be brought into the same GeoEvent Service as dynamic point geofences. Using a spatial filter, the buffered street segments could be evaluated every 5 seconds to see if they "contain" any snow plow. Assuming a street segment "contains" a snow plow, we can configure the next step of our GeoEvent Service to update the status of that street to "recently cleared in the last..." since the intersection occurred with the snow plow. The output of this hypothetical service would be your streets data with a treatment status attribute updated in real time. If you were using this output to update a feature service, and that feature service was already in a web map, you could visualize the real-time treatment status of your roads based on a symbology that uses the changing field. We can fine-tune other aspects of the analysis to account for time, road segmentation, whether or not the plow has its arm down, etc., but I wanted to throw this out there to demonstrate one of many potential approaches to this problem.

Some of this might sound like jargon if you're new to GeoEvent Server, and that's okay. If you are new to GeoEvent Server, I would recommend working through the introductory tutorial series found here. It'll give you a better idea of how GeoEvent Server works and how it fits into the broader ArcGIS Enterprise.
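The "buffered segment contains a plow" test above boils down to a point-within-distance check. Here is a self-contained toy version (my own simplification, not GeoEvent Server's spatial filter; it assumes planar coordinates in meters and a simple two-point segment):

```python
import math

def segment_contains_plow(seg_start, seg_end, buffer_m, plow_xy):
    """A buffered street segment 'contains' a snow plow when the plow's
    point lies within buffer_m of the segment itself."""
    (x1, y1), (x2, y2) = seg_start, seg_end
    px, py = plow_xy
    dx, dy = x2 - x1, y2 - y1
    # Project the plow onto the segment, clamped to the endpoints
    if dx == 0 and dy == 0:
        t = 0.0
    else:
        t = max(0.0, min(1.0, ((px - x1) * dx + (py - y1) * dy) / (dx * dx + dy * dy)))
    cx, cy = x1 + t * dx, y1 + t * dy
    return math.hypot(px - cx, py - cy) <= buffer_m

# A plow 3 m off a 100 m street segment, with a 5 m buffer
print(segment_contains_plow((0, 0), (100, 0), 5, (50, 3)))  # True
print(segment_contains_plow((0, 0), (100, 0), 5, (50, 8)))  # False
```

In the real service this evaluation would happen on each polling cycle, and a True result would trigger the update to the street's treatment status attribute.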
Posted 09-11-2019 07:51 AM

POST
Hi Emil,

It sounds to me like there are two separate issues here. The first is that the refresh rate settings aren't displaying on your service, and the second is that you cannot visualize the data.

For the first problem, involving the refresh rate settings, I would double-check the steps you followed in the module (specifically page 33). The tutorial discusses copying the FeatureServer REST URL, not the MapServer REST URL. From what I have observed, the refresh rate property isn't applicable to the map service (since it returns a map export, as opposed to a query). I'd be willing to guess that you accidentally copied the MapServer REST URL. To demonstrate, here is what I saw when I added the map service to my web map (notice the nested layer and how it is similar to your screenshot). Likewise, here is what is available when the feature service is added to the web map; this matches the tutorial.

To really address the second problem, involving the failed visualization, we'll need more information about your configuration to say what the source of the problem is. Without knowing more about what you're specifically seeing, there are a few general suggestions I can offer. For example: Does your outbound GeoEvent Definition schema match the schema of the published feature service? Are you seeing event records come inbound from the GeoEvent Simulator over TCP? Is your GeoEvent Server processing the event records inbound and outbound? Are you able to see the same data that is going to your Flights feature service within the GeoEvent Logger? If data is in fact being sent to your Flights feature service, have you tried querying the features at the REST endpoint to see if there's valid geometry? These are just a few things I might suggest checking in order to move forward again.

If you have any further information about what you're seeing happen, please feel free to share so that the community here may better aid you. Likewise, Esri Support Services is always available to assist with the tutorial materials should you like to have a conversation.
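For the spot-check suggested above (querying the features at the REST endpoint), a query URL can be assembled like this. The sketch only builds the URL; the server name is a placeholder, and the parameters shown (`where`, `outFields`, `returnGeometry`, `f`) are standard ArcGIS REST API feature layer query parameters.

```python
from urllib.parse import urlencode

def feature_query_url(feature_server_url, layer=0, where="1=1"):
    """Build a feature layer query URL to inspect records and geometry.
    feature_server_url should end in '/FeatureServer'."""
    params = urlencode({
        "where": where,            # return everything by default
        "outFields": "*",          # all attribute fields
        "returnGeometry": "true",  # include geometry so it can be inspected
        "f": "json",
    })
    return f"{feature_server_url}/{layer}/query?{params}"

# 'myserver.example.com' and the 'Flights' service path are placeholders
url = feature_query_url(
    "https://myserver.example.com/arcgis/rest/services/Flights/FeatureServer")
print(url)
```

Opening the resulting URL in a browser shows whether features exist and whether their geometries are populated.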
Posted 07-03-2019 08:57 AM

POST
It was great meeting other individuals who were new in their GIS careers. Learning about their goals and interests (as related to GIS) helped expose me to different aspects of the industry and, most importantly, helped me develop professional connections. The atmosphere of the evening social (during the International User Conference) made it a very relaxed experience.
Posted 07-03-2019 07:35 AM

POST
Hello Simon,

I know I am a bit late to the discussion here, but I wanted to provide a response in case someone else comes across this thread. The 406 error looks to be due to the acceptable MIME type included in the HTTP request header. By default, the "Poll An External Website for XML" input connector defines the default acceptable MIME type as "application/xml". This MIME type isn't always defined on the web server and therefore isn't acceptable to it (hence the 406). As a result, it is up to us to change the acceptable MIME type to something the external web server recognizes in order to issue a 'successful' request (I emphasize successful since the request is in fact valid; it's just not what the web server wants or recognizes). With external XML sources, I have had success changing the acceptable MIME type to text/xml, application/xhtml+xml, etc. I would recommend reviewing the MIME types available for XML and changing the acceptable MIME type in ArcGIS GeoEvent Server from application/xml as needed.

In situations where we have direct access to the web server hosting the XML, a different solution to this problem is to define the MIME type on the web server as application/xml (ArcGIS GeoEvent Server's default). In Microsoft IIS for example (installed with the ArcGIS Web Adaptor), we can see that the default MIME type for .xml extensions is text/xml. If we wanted to, we could change this default in Microsoft IIS from text/xml to application/xml for .xml extensions. With that change in place, we wouldn't need to adjust the acceptable MIME type default in ArcGIS GeoEvent Server (though we could if we wanted to, hence the previous paragraph).

The key takeaway here is to pay close attention to the MIME types being used. Sometimes what ArcGIS GeoEvent Server uses by default isn't what the web server recognizes, and vice versa. Hope this helps!
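In plain HTTP terms, the fix described above amounts to changing the Accept header on the polling request. A minimal sketch with Python's standard library (the feed URL is a placeholder; this only constructs the request, it doesn't send it):

```python
import urllib.request

def xml_poll_request(url, accept="text/xml"):
    """Build an HTTP request whose Accept header advertises an XML MIME
    type the web server recognizes, instead of the application/xml
    default that triggered the 406."""
    return urllib.request.Request(url, headers={"Accept": accept})

req = xml_poll_request("https://example.com/feed.xml")
print(req.get_header("Accept"))  # text/xml
```

If the server still returns 406, trying another XML MIME type such as application/xhtml+xml is the analogous next step.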
Posted 06-25-2019 05:58 AM

BLOG
For the 10.7.1 release of ArcGIS GeoEvent Server, we are excited to announce new documentation for the existing out-of-the-box input and output connectors. A separate documentation page has been provided for each connector that includes a summary, unique usage notes, a list of properties, and known limitations. To access this content, visit the existing Available input connectors and Available output connectors landing pages, where you'll notice that the 10.7 version of the documentation includes links for each of the existing connectors in place of the original text-based list. Clicking on any of these links will bring you to the new documentation for the specified connector. Additionally, you can view the new material as a list by accessing the Input connectors and Output connectors topics under Connect to Data and Send Updates and Alerts.

As mentioned before, the new documentation for each input and output connector includes unique usage notes. These usage notes are intended to provide additional information about each connector. You'll find information regarding best practices, tips and tricks, expected behavior, references to additional documentation, and configuration considerations.

Below the usage notes for each input and output connector is a complete list of available parameters. It is worth noting that this list includes the parameters shown by default as well as those which are hidden because they are "conditional" (or dependent) on other parameters being configured a certain way before they appear. You'll find that each parameter is paired with a unique description that explains what the parameter is for, what configurable options are available, what the expected input value(s) may be, and in some cases what the default value is.

As always, step-by-step documentation on how to configure various input and output connectors can be found in our existing tutorial-based documentation here: ArcGIS GeoEvent Server Gallery.
Posted 06-21-2019 08:48 AM

POST
Hi Daniel Cota,

The construct geometry property fields are entirely dependent on the GeoEvent Definition chosen by the user. Let me explain through a few scenarios.

By default, the "Poll an External Website for JSON" input connector has the "Create GeoEvent Definition" property set to Yes. This means the user is prompted to provide a name for what will be their eventual GeoEvent Definition. Technically, the fields do not yet exist since the definition itself does not yet exist. Should the user also choose to 'construct geometry from fields' from what is otherwise a non-existent GeoEvent Definition in this hypothetical situation, they are only going to be presented with fields/tags from 1) an already existing GeoEvent Definition that is 2) alphabetically first in the list of existing GeoEvent Definitions. In a scenario where a user is working with a fresh install of ArcGIS GeoEvent Server, this would be the "Incident" GeoEvent Definition (and its fields/tags), since it precedes the "TrackGap" GeoEvent Definition also installed with the product (I before T). In another example, let's say the user already has several GeoEvent Definitions but still chooses to enable "Create GeoEvent Definition" and "Construct Geometry from Fields". If there's an existing GeoEvent Definition named "Airplanes", then by default you're only going to see the fields/tags from the Airplanes GeoEvent Definition, since it is first alphabetically.

If you wanted to tinker with this to get a clearer idea of what I am talking about, you can set "Create GeoEvent Definition" to No, select a different existing GeoEvent Definition (e.g., anything that isn't Incident), set "Create GeoEvent Definition" back to Yes, and then observe what fields are made available for the Construct Geometry From Fields sub-properties (X, Y, Z). While this is certainly an odd exercise, I share it with you to illustrate that "Create GeoEvent Definition" (set to Yes) in unison with "Construct Geometry From Fields" (set to Yes) makes the most sense when you type out the field names based on the expected inbound schema. For example, I might poll https://api.wheretheiss.at/v1/satellites/25544, set "Create GeoEvent Definition" to Yes, set "Construct Geometry From Fields" to Yes, and then enter "longitude" (no quotes) for X and "latitude" (no quotes) for Y. Or, to make things simpler, I would have GeoEvent Server poll the data, create the initial GeoEvent Definition, and then go back into the input settings to have it construct geometry from the fields of the GeoEvent Definition it just created and understands (granted, you might need to change the names of the fields).

Where I might take more advantage of the drop-down fields/tags is when I set "Create GeoEvent Definition" to No, select the correct GeoEvent Definition for the inbound records, and then choose the appropriate X, Y, and Z fields associated with the GeoEvent Definition I specified. In no scenario are you going to see the X, Y, and Z geometry fields show all possible fields from every single GeoEvent Definition. Lastly, I should add that only certain field data types will show when constructing geometry. For example, a field with a data type of "date" isn't going to be an option since coordinates cannot be expressed as dates.

While I hope this helps clarify things, let me know if you have any further questions.
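The alphabetical selection described above is easy to demonstrate in miniature (definition names are the ones from the discussion; "Airplanes" stands in for a user-created definition):

```python
# On a fresh install, only the two out-of-the-box definitions exist.
fresh_install = ["Incident", "TrackGap"]
print(sorted(fresh_install)[0])        # Incident (I before T)

# Add a user-created definition that sorts earlier alphabetically,
# and its fields are the ones shown instead.
with_user_definition = fresh_install + ["Airplanes"]
print(sorted(with_user_definition)[0])  # Airplanes
```

This is why the fields presented can silently change as definitions are added: whichever definition sorts first wins.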
Posted 05-20-2019 01:39 PM

POST
Hello Gregg,

If I understand correctly, working through an administrative reset resolved this particular issue for you? Looking back at your previous post, the problem seemed to be that the certificate was incorrectly specified as .pfx.cer within ArcGIS Server. Had this changed? Were you able to confirm that the certificate had changed after working through the administrative reset?
Posted 08-14-2018 09:36 AM

POST
Hello Gregg,

I'd be curious to know what error regarding ZooKeeper you were seeing, but to comment on the issue at hand: is GeoEvent Server referencing the original self-signed certificate? If so, it may be worth taking a look at what certificates are present in the location specified (see below). You can start by comparing these certificates to those found in the ArcGIS Server framework to see if there's any sort of mismatch (...\Program Files\ArcGIS\Server\framework\etc\certificates).

D:\arcgisserver\config-store\machines\<machine-ip-removed>\<machine.certificate.com.removed>.pfx.cer

More importantly, the type of advice I can give you is largely contingent on the version of ArcGIS GeoEvent Server you are working with. There were quite a few changes to ZooKeeper over the last several versions, so that could play into how this issue gets treated. Furthermore, some of the information you omitted from the logs could be useful for more nuanced troubleshooting. If you are open to doing so, I invite you to log a case with Esri Support Services so that you and I can investigate this issue. Should you choose to go this route, just be sure to reference me in the case request so that it can make its way to me. I'm happy to be of help.
Posted 08-08-2018 01:01 PM

POST
Hi Martin,

Other users have also encountered this behavior. It has been logged as BUG-000101818, "The attribute fields are hidden in Collector for ArcGIS when the fields are set to edit only in the pop-up configuration". A programmer has been assigned to this issue and is currently working on integrating the fix into a future Collector app release. Please keep in mind, however, that there are several factors such as time constraints, project scope, and risk of implementation that may cause this bug to be deferred to a later release. If you would like to be attached to this bug to track its progress, please contact Esri Support (Esri Support Contact Tech Support).
Posted 01-13-2017 07:06 AM