POST
Learn how to get started with consuming data from Spire's AirSafe Tracking Stream in ArcGIS Velocity using the HTTP Poller feed.
Posted 04-29-2021 10:08 AM

POST
Hi Arthur, Instead of using URL parameters, you'll want to click "configure expression" to format your JSON object using Arcade. Based on your example, the only thing you'll want to keep as a URL parameter is the authorization token.

[Image: The configure expression option for the JSON body]

Specify the following to maintain the JSON object with dynamic field values.

[Image: The Arcade expression, and the resulting JSON when I tested it]

Lastly, it's important to note that the field "paras_value" exists because I flattened the paras object as part of the feed schema. If you didn't flatten "paras" as part of the feed schema, then it's likely your field name and value will be different.
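For anyone following along without the screenshots, the request the HTTP output ends up making is roughly equivalent to the Python sketch below. The endpoint URL, token, and field values are hypothetical placeholders; the body mirrors the Text() Arcade expression shown in my earlier reply further down this page.

```python
import requests

# Hypothetical endpoint and token -- substitute your own values.
url = "https://example.com/api/commands"
params = {"token": "YOUR_AUTH_TOKEN"}  # the auth token stays a URL parameter

# Placeholder values standing in for $feature["service_id"],
# $feature["command_name"], and $feature["paras_value"].
body = {
    "service_id": "svc-001",
    "command_name": "restart",
    "paras": {"value": "42"},
}

resp = requests.post(url, params=params, json=body)
print(resp.status_code, resp.text)
```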
Posted 04-13-2021 05:57 AM

POST
Hi Arthur, What you're asking is definitely possible with ArcGIS Velocity. To achieve this task, you would want to use the HTTP output to construct a POST request. Simply provide the URL, select the content type (JSON), and add any additional headers (e.g., an authorization token). For the message body, you'd wrap your JSON object using the Text() Arcade function so that you can keep the JSON formatted. You can even parameterize the values in your JSON by referencing the fields from your processed data. As an example, here is what I had tested using your sample data; the $feature references are the Arcade parameterization I mentioned above:

```
Text({
  "service_id": $feature["service_id"],
  "command_name": $feature["command_name"],
  "paras": {
    "value": $feature["paras_value"]
  }
})
```
Posted 04-05-2021 11:29 AM

BLOG
ArcGIS Velocity offers the ability to send an email as part of a configurable real-time or big data analytic. This is often useful when events of interest warrant an immediate notification. For example, you may need to be notified when a vehicle in your fleet has experienced engine trouble and requires assistance right away. Likewise, it could be equally imperative that you are informed about summary statistics derived by your big data analytic after it has finished processing sensor data gathered over the last month. Either way, emails allow you to act upon your data that much faster.

The Email output in ArcGIS Velocity allows you to send an email via an externally accessible SMTP server. You can certainly configure your Email output with your own organization's SMTP server if it is externally accessible on the internet. However, if this is not an option, or you do not have an SMTP server of your own, it is worth considering alternatives such as Twilio's SendGrid SMTP service. In this blog, we will explore how you can configure your Email output to use SendGrid's SMTP server as one option for sending emails in ArcGIS Velocity.

What is SendGrid and how can I use it?

SendGrid is a cloud-based SMTP service provider. A SendGrid account with the free Email API plan enables you to send up to 100 emails a day via SendGrid's SMTP server; paid subscription tiers support even more. With a few simple steps, the SendGrid SMTP server can be configured with ArcGIS Velocity to send emails as part of your real-time and big data analytics.

Before getting started, it is important to note that an email will be sent for each discrete feature that an instance of the Email output receives in ArcGIS Velocity. For example, if you have configured a feed that ingests 250 events per second, and this feed is used in a real-time analytic just to send email alerts, you could accidentally emit 250 emails every second without any sort of filtering of your data as a precursor. Talk about spam! This high rate of messaging could quickly overload the capacity of your organization's SMTP server or result in service suspension from your SMTP service provider. Consider the functional design of your analytic, as well as any rate limitations from external SMTP services, before sending emails via Velocity. It may be necessary to consider a paid SMTP service subscription to guarantee a large quantity or rate of emails. However, unless necessity dictates otherwise, the best practice with Email outputs is to be conservative with the number of emails generated per second, minute, hour, and so forth.

Configure SendGrid for ArcGIS Velocity

To use SendGrid for your SMTP server needs, you will first need a SendGrid account. If you do not have one already, you can create one on the SendGrid website using the free Email API plan to get started. Consider SendGrid's terms of service before proceeding.

Once you have a SendGrid account, you will then want to create a single sender identity. A sender identity is essentially what represents you as your "from" email address. This is the email address that recipients will see as the sender and is what ArcGIS Velocity will use as the from address. You can create a single sender identity from the SendGrid dashboard upon signing into your new SendGrid account. You will need to verify your single sender identity by checking the email address you specified.

If you are using an email address that is part of your organization's domain, you may want to review any DMARC policies that are in place, as these could interfere with using that address as your single sender identity. In any case, follow the instructions provided by SendGrid to configure a single sender identity.

[Image: Create a sender identity after you sign into SendGrid]

After you specify a single sender identity, you should be ready to send your first email with SendGrid. There are two methods to send an email: the first is to integrate using SendGrid's Web API or SMTP relay, and the second is to build and send using marketing campaigns. For ArcGIS Velocity, you will want to choose the first method so that you can configure an SMTP relay.

Integrate using SendGrid's SMTP relay

SendGrid requires that you first create an API key to authenticate with its API and send emails using its SMTP relay. The API key acts as the password for sending emails, via the SMTP relay, in applications like ArcGIS Velocity. Once you specify a name for and generate your API key, you will be provided the SendGrid SMTP server address, ports, username, and password (the API key). Store this information in a safe and secure location after it is generated; you will need to reference it when configuring your Email output(s) in Velocity. At this point, you can verify your integration.

[Image: Store the SMTP address, ports, username, and password]

Configure an Email output in ArcGIS Velocity

Now that you have established all the necessary SendGrid components, you can access ArcGIS Velocity and begin to configure an Email output that uses the SendGrid SMTP server. Navigate to an existing analytic, or create a new one, to begin configuring an Email output. When you add an Email output to an analytic, you will first need to open it to configure your SMTP settings.

[Image: Add an Email output to an analytic from the outputs folder]

Step one when configuring an Email output is to set the SMTP server options. Referring to the SendGrid settings you stored earlier, specify the SMTP Host property as the SendGrid SMTP server (often smtp.sendgrid.net). By default, you can use port 465 and leave the SMTP server security property set to SSL/TLS. For the Username property, specify the username provided by SendGrid (often apikey). Lastly, for the Password property, provide the API key that was generated earlier.

[Image: Configure the SMTP server with the SendGrid settings]

At this point, you should be ready to proceed to step two, where you configure your email fields. The From email address should be the email you provided as your single sender identity. The remaining properties, To email addresses, Subject, and Message body, can be specified however you wish. You can either enter manual values, as I did for my Subject and Message body below, or reference fields and their values from your data, as I did using my "email_recipient" field for the To email addresses property.

[Image: Configure email fields with manual entry or data values]

Once you click Complete, you should be good to go with sending emails as part of your real-time or big data analytic. Remember that the free Email API plan from SendGrid caps at 100 emails a day; if you require more than that, it may be worth exploring an alternative subscription plan.
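If you want to confirm your SendGrid credentials independently of Velocity, a quick way is to send a test message through the same relay yourself. Here is a minimal Python sketch, assuming the standard SendGrid settings described above; the sender, recipient, and API key are placeholders.

```python
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alerts@example.com"    # your verified single sender identity
msg["To"] = "recipient@example.com"   # placeholder recipient
msg["Subject"] = "Velocity SMTP relay test"
msg.set_content("Test message sent via SendGrid's SMTP relay.")

# Same settings the Email output uses: SSL/TLS on port 465,
# the literal username "apikey", and your generated API key as the password.
with smtplib.SMTP_SSL("smtp.sendgrid.net", 465) as server:
    server.login("apikey", "YOUR_SENDGRID_API_KEY")
    server.send_message(msg)
```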
Sending emails is just one of many ways to notify others of critical events. If you are interested in seeing what other outputs are available for sending notifications, or processing data in general, feel free to visit and explore our resource on outputs in ArcGIS Velocity.
Posted 01-19-2021 01:33 PM

POST
Hi Saurabh, Most often, at a minimum, all you need to utilize ArcGIS GeoEvent Server is an installation of ArcGIS Server (licensed as GeoEvent Server), and then an installation of GeoEvent Server on the same machine. This type of setup should be sufficient to allow you to read from ArcGIS Online feature services, and subsequently write to ArcGIS Online feature services. It's not necessary to install and configure a Portal and the other supporting ArcGIS Enterprise elements.

It's important to note, however, that stream services (from GeoEvent Server) cannot be published/hosted in ArcGIS Online. They have to be published to the underlying ArcGIS Server since they run as part of the GeoEvent Server service (to really generalize things here). You can certainly add the stream services to your ArcGIS Online organization as items (just as you would with a non-hosted map/feature service), but the stream service still needs to be published and running on the GeoEvent Server machine.

If you are primarily working with ArcGIS Online already, don't have a need for an on-premises Portal (or ArcGIS Enterprise, for that matter), and don't want to deal with the infrastructure aspects of managing GeoEvent Server, we do offer a new product called ArcGIS Velocity which could likely fit your needs. Velocity is a new capability of ArcGIS Online that provides real-time and big data processing and analytic capabilities. With it, you can ingest real-time data, perform on-the-fly analytics, and even output to feature/stream services hosted in ArcGIS Online (and more). Since ArcGIS Velocity is a SaaS offering, there's nothing for you to manage in terms of infrastructure; everything is hosted in ArcGIS Online. Something worth considering given your questions! Hope this helps!
Posted 10-22-2020 04:45 AM

BLOG
Have you ever had to employ a file-based output connector such as Write to CSV File to better understand and troubleshoot what is happening with your real-time data as it's processed in a GeoEvent Service? Or rather, have you ever had to publish a pseudo stream service or feature service just to verify the geometry of your event data is as expected? What happens when you forget to configure an output connector while creating a new GeoEvent Service? What do you do when you realize you need to edit an input in a GeoEvent Service before you're ready to publish it?

If you're like me, these are just a few hurdles you've encountered when working with different elements in the service designer of ArcGIS GeoEvent Manager. It can take time, patience, and a certain level of familiarity to be successful. We've recognized how many steps you must go through to simply and quickly determine what's happening with your real-time data as it's processed in a GeoEvent Service. Additionally, having to navigate to multiple pages in GeoEvent Manager to create new or edit existing inputs, outputs, site settings, and more can take time. Given these realities, I'm excited to announce many new usability enhancements, as well as a new sampling utility, available in the service designer at 10.8. These new capabilities will help you be more efficient and effective when defining your real-time services, making the hurdles mentioned above a thing of the past. So, let's explore some of these new capabilities in more detail!

Working with elements and settings in the service designer

First, we've made some exciting functionality and usability enhancements to the service designer in GeoEvent Manager. Many of you are probably familiar with the New Element list, where you could previously only add new filters and processors to a GeoEvent Service. With 10.8, you now have the ability to create new inputs and outputs, as well as copy existing inputs and outputs, and configure them directly in the service designer. So if you're in a GeoEvent Service and you forget to create an input or output connector, there's no longer a need to leave the current GeoEvent Service or open another browser tab just to create it. Simply add an input element from the New Element list to the canvas, configure the properties, and save it. It's the same workflow you've become familiar with when adding and configuring both filters and processors.

You may be thinking, "what do I do if I need to edit an input or start or stop that input?" Well, there's now the ability to start, stop, and even edit existing connectors from inside the service designer. Simply right-click an input/output you've added to the canvas, or one that exists in the element list, to access the action menu. For example, in the illustration below, by right-clicking the Receive Flights Data input, you can choose to edit the input's properties, delete the input, fit the input's bounding box to the text, as well as start or stop the input to control the flow of data. By choosing to edit the input properties, a new window will appear allowing you access to all the familiar properties of the input. Should you choose to stop the input, a status icon on the element will change from green to gray to reflect the stopped state.

[Image: Quickly edit, delete, start, and stop an input or output by right-clicking the element in the canvas.]

While input and output connectors are certainly important, they're only half the story when it comes to publishing a working GeoEvent Service.
There are also several GeoEvent Server site settings that need to be considered and set appropriately, oftentimes before successfully publishing a GeoEvent Service. That leads to our next exciting enhancement: the ability to access and edit several key GeoEvent Server site settings directly in the service designer. These settings include access to your GeoEvent Definitions, GeoFences, Data Stores, and Spatiotemporal Big Data Stores. Have you ever realized halfway through configuring a spatial filter that you forgot to configure your GeoFences? Or maybe you forgot to create the target GeoEvent Definition for a Field Mapper Processor? Rather than leave your GeoEvent Service, and potentially lose the work you put into configuring it, simply access those settings directly in the service designer. From the Site Settings list, you can double-click any of the settings available to open and access those particular settings. When you're finished making any necessary updates, just save your changes and continue configuring your GeoEvent Service.

Sample and view real-time data

In addition to the above enhancements to the service designer, I'd like to next introduce you to our newest utility in ArcGIS GeoEvent Server 10.8: the GeoEvent Sampler. As the first of its kind, the GeoEvent Sampler is an embedded utility in GeoEvent Manager's service designer that allows you to quickly sample, review, and even visualize processed data in real-time as it flows through routes and elements in a GeoEvent Service. No longer is it necessary to spend time and effort configuring different types of ephemeral outputs just to review, visualize, or verify your processed data is as expected. Unlike GeoEvent Logger or GeoEvent Simulator, which are separate Windows applications, GeoEvent Sampler is embedded in GeoEvent Manager. Therefore, you'll be able to use this new utility in both Windows and Linux environments. Let's explore how the GeoEvent Sampler can be used to help you build and/or troubleshoot a GeoEvent Service.

Verifying your schema

Let's say you want to ensure your Field Mapper Processor is altering the schema of your processed event data correctly. Prior to 10.8, you could write the data emitted from the Field Mapper Processor to an external JSON file. This meant first creating and configuring a separate Write to a JSON File Output Connector. Next, you would need to add that output to your existing GeoEvent Service. Once this was set up, and your outbound data was writing to a JSON file, you would then need to open the JSON file in a text editor to review the data as formatted JSON. While this workflow isn't necessarily difficult to accomplish, it can be time consuming.

Using the new GeoEvent Sampler at 10.8, you can simply select the route that connects the Field Mapper Processor to your next element (e.g., an Output Connector as shown in the illustration below) and sample the event records (formatted in JSON) that are being emitted in real-time on that route.

[Image: A sampled event record allows you to confirm the definition and schema are correct.]

By never having to leave the service designer, you can quickly confirm whether or not the schema is being updated as expected. If you happen to notice that one of your target definition fields is spelled incorrectly, or that the data type of a field is a string instead of an integer, simply edit the target GeoEvent Definition without leaving the service designer.
Remember, GeoEvent Definitions can now be edited directly in the service designer at 10.8, as mentioned above. After making your edits, want to double-check that the changes are correct? Just publish the GeoEvent Service and sample the event data on that route again!

Another useful scenario where GeoEvent Sampler could come in handy is comparing event data on two routes in a GeoEvent Service. For instance, suppose you wanted to compare your event data's original schema to the schema after it's emitted from a Field Mapper Processor. First, select and sample the route that's sending the data into the Field Mapper Processor. This will sample the data right before its schema is transformed by the processor. Next, select and sample the route that's emitting data from the Field Mapper Processor for comparison. This will sample the data after the schema has been transformed. With just a few clicks, you can see the data changing in front of you, in real-time.

[Image: Comparison sampling confirms the source and target fields are mapped correctly. For example, the MPH field (left) was renamed to Speed (right).]

Verify attribute values

GeoEvent Sampler can also be used to verify the attribute values of your real-time data. Continuing with the flight data example in the illustration above, there is a new field called Speed whose value is in miles per hour (e.g., 500 mph). Let's say you want to change the mph value to kilometers per hour (kph). To do this, you need to use a Field Calculator Processor to multiply the value in the Speed field by 1.609344 (i.e., Speed * 1.609344). While this is easy enough, how can we quickly verify the conversion is happening before configuring the rest of the GeoEvent Service?

Prior to 10.8, one option would be to create and configure a Push Text to an External TCP Socket Output Connector and view the output in GeoEvent Logger. Data from before and after the Field Calculator Processor could be routed to the output and viewed in GeoEvent Logger. While this is relatively simple to set up, it does take time, just like the previous JSON file example above. Something else worth considering with this type of workflow is data velocity. Depending on the rate at which your real-time data is being received and processed, it could be a challenge to review the data in GeoEvent Logger since it could be constantly updating with new information. You could close the TCP connection in order to review the data in GeoEvent Logger, but that's just another factor to consider.

Now at 10.8, you can use GeoEvent Sampler and select the route before the Field Calculator Processor to observe the speed data in miles per hour (mph), and then select the route after the processor to view the speed data after it's been converted to kilometers per hour (kph). It's important to note that GeoEvent Sampler is not a logging utility; you can only sample a fixed number of GeoEvents on a route (1, 10, or 100 at a time). So, in the case of the flights example, you could quickly sample a single GeoEvent on each route to verify the processor is correctly calculating the speed in kph. There's no need to comb through hundreds of processed GeoEvents to verify the same thing. The graphic below illustrates this.

[Image: Comparison sampling of two non-adjacent routes confirms the original MPH field (left) and its value of 502.0 was correctly converted to kph in the Speed field (right) with a new value of 807.89.]
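As a quick sanity check on the arithmetic the processor should produce, here is a trivial sketch using the sample value from the illustration above:

```python
MPH_TO_KPH = 1.609344  # same constant used in the Field Calculator expression

speed_mph = 502.0
speed_kph = speed_mph * MPH_TO_KPH  # mirrors the expression: Speed * 1.609344
print(round(speed_kph, 2))          # 807.89 -- matches the sampled value
```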
Verify geometry

So far we've covered how GeoEvent Sampler can be used to review and validate your schema and attributes, but what about the geometry of your processed data? Included with GeoEvent Sampler is a capability called the Event Viewer. It can be used to display the geometry of processed event data whose GeoEvent Definition geometry field has been tagged with GEOMETRY.

Let's say you're tracking airplanes and using a Buffer Creator Processor to create a buffer for each airplane. You want to ensure the airplanes are being buffered by the correct distance before proceeding with the configuration of the GeoEvent Service. In this case, you need to make sure the point geometry is being changed to polygon geometry. Prior to 10.8, you could verify this a few different ways. First, you could write the buffered GeoEvent to a JSON file and review the geometry object for rings. But even then, how do you know that this geometry will be displayed correctly in a feature or stream service? Unless you're an avid fan of deciphering JSON syntax, you'll likely try to send the data to a feature or stream service to see if it takes. After all, if you can see the geometry of the data display in a web map, you can be confident that buffering is happening and that the geometry is what you expect. Of course, checking the geometry of a GeoEvent by using a temporary feature or stream service first means creating and configuring those services. Before that can be done, there are GeoEvent Definitions, data store connections, and other factors to consider.

Using GeoEvent Sampler at 10.8, you can now sample the GeoEvents emitted from the Buffer Creator Processor. Once you have a sampled GeoEvent, you can then simply open the Event Viewer to display the geometry. If you choose to sample a single route, the Event Viewer will display the geometry of the sampled records from that route. If you choose to sample two routes for comparison, the Event Viewer will display two map views that show the geometry of the sampled data from both routes respectively. In the illustration below, you can see the point geometry of the source flight data and the polygon geometry of the buffered data output from the Buffer Creator Processor.

[Image: The Event Viewer in GeoEvent Sampler allows you to confirm the valid point geometry of the source event data and the valid polygon geometry of the buffered event data.]

These are just a couple of ways GeoEvent Sampler can be used to assist with the creation or troubleshooting of your GeoEvent Services. Other example use cases of GeoEvent Sampler not covered here include (but are not limited to) checking datetime values, filtering, regular expressions, constructing point geometries, and more. Hope you enjoy using these new enhancements! If you have ideas for future enhancements, please submit those on the Real-Time GIS place on ArcGIS Ideas.
Posted 02-20-2020 10:38 AM

BLOG
Back in June of 2019, at the 10.7.1 release of ArcGIS GeoEvent Server, we shared the news of new documentation for the available out-of-the-box input and output connectors. Today, we are happy to announce new and expanded documentation for each of the available processors in GeoEvent Server 10.8. Like the connector documentation, each processor now has expanded information to help you properly configure it.

Access the new processor documentation by visiting What are Processors?, where you can access links to each individual processor you may be interested in. Alternatively, you can access the new processor help directly in ArcGIS GeoEvent Manager when editing or configuring a new processor. Simply click the Help button in the processor property dialog to expand the embedded help. It's important to note that even though this new documentation is available with the 10.8 release, the information is still applicable to previous versions as well.

The new processor documentation follows the same format and style as the input and output connector documentation that you may already be familiar with. First, you'll see a general summary as well as some example use cases for each processor. Beneath that you'll find detailed usage notes. These notes are intended to provide some extra contextual information for successful configuration of each processor. Following the usage notes is a parameters table that includes detailed information about the parameters for each processor, what each parameter does, options to configure it, and more. The parameters table lists all of the available parameters, meaning you can learn about the parameters that are shown by default in addition to the conditional parameters that are initially hidden due to their dependencies on other parameters. We recognize that many of the available processors have their own nuances and quirks, so we've included a final section that details various considerations and limitations. This information is intended to provide additional context about how the processor fundamentally works and will hopefully help guide how you approach incorporating it into your GeoEvent Services.

As with our input and output connectors, you can find step-by-step tutorials on how to configure many of the available processors in the ArcGIS GeoEvent Server Gallery.
Posted 02-13-2020 03:02 PM

POST
Hi James, You're welcome. To answer your question about whether filters can cause a service to slow down, the short answer is yes* (*with some nuances to consider). Any time you add filter or processor elements to a GeoEvent Service, you are effectively introducing additional processing overhead. Each element requires a varying amount of system resources to perform its designated work on the inbound real-time data. When we start to take into account the rate, schema, and size of the data (as well as other external factors such as network latency and system resources), this type of discussion can very quickly become more nuanced. I don't bring this up to give the false impression that a single filter tanks the performance of a GeoEvent Service. That's not true. Rather, the takeaway should be that all of these variables can have varying effects on the total throughput of a GeoEvent Service (and by extension, a GeoEvent Server site overall).

While it's certainly best to consider the above points when building out a GeoEvent Service, I don't think it's directly related to the problem you described. What strikes me as interesting is what you said about how the behavior improves when three of your four processors (I read this as routes) are disabled. This tells me that there might be an issue with the design of your GeoEvent Service as it relates to event record throughput. It's important to consider how GeoEvent Server handles event records when there's more than one element in the service to send data to. Take your GeoEvent Service, for example. You have one input connector (named "ENRICHED_JSON") that is configured to send data down four separate paths to four different elements (PDS Filter, SUMMARY_FILTER, filter, and the All Message Mapper). If your data is coming in at a rate of 1000 events per second, for example, GeoEvent Server is actually taking a copy of those 1000 records and sending them to each of the four elements. At any given point in time, this works out to 4000 events being processed per second, since the data coming in is replicated for each route (see the quick sketch at the end of this reply). The filters might be discarding a lot of the data downstream, but that doesn't change the initial pressure at the start of the service. It makes sense to think that the bottom route is handling 'the most' data, but the truth of the matter is that each of the first four elements is handling the same amount of data (they're just doing different things with it). It's the fourth element that is passing the most data through.

If you're interested in learning more about GeoEvent Server best practices, ideal service design, performance, and so on, I would recommend checking out some of our best practice conference session recordings. If you'd like to send me a direct message, perhaps we can set up some time to discuss this in more detail. I still have some questions about the behavior you're seeing, the rate of your data, what the end goal is as far as writes to the feature service are concerned, and so on. I don't want to mislead you with any particular advice if the issue is in fact related to something else entirely that wasn't covered here. Let me know.
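As a footnote, here is a trivial sketch of the fan-out math described above; the rates are the illustrative figures from this reply, not measurements from your site:

```python
inbound_rate = 1000     # events per second arriving at the input connector
routes_from_input = 4   # PDS Filter, SUMMARY_FILTER, filter, All Message Mapper

# Each route receives its own copy of every inbound event record,
# so the service's internal workload scales with the route count.
internal_rate = inbound_rate * routes_from_input
print(internal_rate)  # 4000 events per second being processed internally
```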
Posted 12-18-2019 08:12 AM

POST
Hi James, To help clarify, are you looking to treat all objects in your JSON array as a single record (analogous to a single row in a database table), or are you looking to simply 'queue' individual records (objects in the array) in such a way that any adds/edits to your feature service are done in 'bulk'?

You had asked: I would like to know if there is a way to treat all of the array elements as a single event which in turn would lead to a single transaction. This would imply that you want to merge all array elements (or objects) so that they are a single event record (i.e., a row in a database table). Likewise, you had also asked: Again, I would like to insert multiple rows in a single transaction using the add feature output connector. This would imply that you want to maintain each array element as its own respective event record; rather than adding these one by one in real-time, you would like to add several in bulk. No? I assume you are asking about the latter scenario, but I would like to be completely sure since the two questions entail entirely different responses.

If I am to assume the latter, you might want to look into the Add a Feature output connector properties "Update Interval (seconds)" and, as you've pointed out already, "Maximum Features Per Transaction".

The "Update Interval (seconds)" parameter is important here since it specifies how frequently the output connector will queue processed event records, make a request to the feature service of interest to add those records as new feature records, and then flush its cache to make room for additional processed event records. By default, this property is set to 1 second, but you could adjust it to something of your choosing, like 2 minutes (120 seconds). Using 120 seconds as an example, GeoEvent Server will queue all processed event records (from your input connector) for 2 minutes. Once the 2-minute window has lapsed, GeoEvent Server will make a request to the feature service to add its queued event records as new feature records. GeoEvent Server will then flush its cache to make room for the next set of processed event records, only to repeat the same process 120 seconds later. The important thing to keep in mind with this property is that it works on an iterating time interval, not a record count (!= 'add every time 200 event records have been queued'). If you know that your data is coming into GeoEvent Server at a consistent rate (e.g., 20 event records per second), you could hypothetically adjust this property to match what you're looking for with respect to a certain 'transaction size'.

Of near equal importance here is the "Maximum Features Per Transaction" property. This property controls how many records to include in any single feature service request to add new feature records. By default it is set to 500, which means each add request will contain a maximum of 500 records. Going back to the 120-second update interval example, let's say you've queued 1800 records over that period of time. This property will ensure that 4 transactions are made to the feature service (500 + 500 + 500 + 300 = 1800) to add new feature records. It's important to note that this property doesn't limit the number of requests made (!= 'only make 2 requests at the maximum transaction size regardless of how much data there is still'). You might be thinking that this property isn't very useful for your scenario, and that might be true, but I thought it was important to share in case you need the requests made to your feature service to generally contain a certain record count (and nothing more). Realistically, this property was intended to help reduce the burden on an external server/service by providing transaction management.

If you're looking for more information regarding the former scenario, please feel free to let me know.
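As a footnote, here is a rough sketch of the batching math described above. The interval, record count, and cap come from the examples in this reply; the chunking calculation is illustrative, not GeoEvent Server's actual implementation.

```python
import math

update_interval_s = 120    # "Update Interval (seconds)" property
max_per_transaction = 500  # "Maximum Features Per Transaction" property
queued_records = 1800      # records accumulated over one interval

# Number of add-features requests issued when the interval lapses.
transactions = math.ceil(queued_records / max_per_transaction)
print(transactions)  # 4  (500 + 500 + 500 + 300 = 1800)
```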
Posted 12-12-2019 07:02 AM

POST
Hi Khadija, To confirm whether or not the input connector is working as it should, we would likely want to see how your data is formatted in the CSV. That'll help confirm the validity of some of the minor settings you've configured. That aside, what immediately stands out to me about your configuration is your usage of the Input Directory property. You probably want to either leave that blank or change it altogether if you haven't tried doing so already. More on why below.

When working with the "Watch a Folder for New CSV Files" input connector, there are really two properties that tell GeoEvent Server where to find the CSV. The first is the Input Folder Data Store. The Input Folder Data Store is essentially a path to a folder location that has been registered with GeoEvent Server. An example of a registered data store folder is the "Automatic Backups" folder included with the installation of GeoEvent Server; "Automatic Backups" points to the path C:\ProgramData\Esri\GeoEvent. The second property is Input Directory (optional). The Input Directory property tells GeoEvent Server where it can look relative to the specified Input Folder Data Store path.

In the case of your configuration, you specified the Input Folder Data Store as "IVMS". For example purposes, let's say the corresponding path for "IVMS" is actually E:\TestFolder (just as "Automatic Backups" points to C:\ProgramData\Esri\GeoEvent). At the same time, you've also configured the Input Directory property to point to "C:\GEOEVENT\INPUT". This is the location relative to E:\TestFolder that you've specified. Considering the two factors above, you are essentially telling GeoEvent Server to look at E:\TestFolder\C:\GEOEVENT\INPUT for your CSV. I could be mistaken here, but I doubt that path exists (or is the path you wanted to specify). This is the likely reason why nothing is coming through.

You should only specify an Input Directory when the registered folder data store isn't the exact location of the file you want GeoEvent Server to read from. I often specify the exact folder location of my content (as a registered folder) and therefore don't need to specify an Input Directory path. However, it's worth mentioning that the property exists so that if somebody has a registered folder data store set to "C:\GeoEvent", for example, they could potentially configure several other input connectors to point to different sub-folder locations underneath C:\GeoEvent without having to register just as many folders. For example, let's say I have one folder for CSV data (called CSV_Folder) and another folder for JSON data (called JSON_Folder) under my hypothetical registered C:\GeoEvent data store folder. If I were configuring a new "Watch a Folder for New CSV Files" input connector, I could configure my Input Directory as CSV_Folder relative to the registered C:\GeoEvent data store folder (i.e., GeoEvent Server would look at C:\GeoEvent\CSV_Folder for CSV files). Likewise, I could configure a separate "Watch a Folder for New JSON Files" input connector to use JSON_Folder as its Input Directory (i.e., GeoEvent Server would look at C:\GeoEvent\JSON_Folder for JSON files).

TL;DR: To simplify things a bit, I suggest creating a new registered folder (Input Folder Data Store) that points to the exact path where your CSV resides. Once you have done that, keep the Input Directory property empty. You may also need to let GeoEvent Server create a GeoEvent Definition initially to make sure the data can be brought in once the path is correct.
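To make the path behavior concrete, here is a small sketch of how the two properties combine, assuming GeoEvent Server simply appends the Input Directory to the registered folder path as described above; the folder values are the hypothetical ones from this reply:

```python
registered_folder = r"E:\TestFolder"    # Input Folder Data Store ("IVMS")
input_directory = r"C:\GEOEVENT\INPUT"  # Input Directory (optional)

# The relative directory is appended to the registered folder path.
watched_path = registered_folder + "\\" + input_directory
print(watched_path)  # E:\TestFolder\C:\GEOEVENT\INPUT -- not a valid path
```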
Posted 10-14-2019 05:51 AM

POST
Hi Patricia, One thing you could check is the mime type included as part of the polling request. Is there a specific mime type used by your API that isn't one of the defaults included with the "Poll an External Website for JSON" input connector? You could monitor the response from the API request in the browser to help determine this. Normally I would expect a status of 415 if this were the issue, but I am not sure what the rest of your logs say. Nevertheless, I thought I would put this out there based on some of my experience with this type of behavior. If you haven't done so already, I might also suggest entering the entire request URL in the URL parameter as opposed to segmenting it via the other parameter properties. This helps reduce the likelihood of an incorrect URL being provided (though this is unlikely). Ideally, we would want to take a closer look at the rest of your logs to see what exactly may be going on here. It might be worth reaching out to Esri Support Services if no other suggestions are of immediate benefit. Kind regards, Gregory
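P.S. If it helps, here is a short Python sketch for inspecting the response status and mime type outside the browser; the URL is a placeholder for the endpoint your input connector is polling:

```python
import requests

# Placeholder for the URL your input connector is polling.
resp = requests.get("https://example.com/api/data")

print(resp.status_code)                  # 415 would point to a mime-type issue
print(resp.headers.get("Content-Type"))  # ideally application/json
print(resp.text[:200])                   # peek at the payload itself
```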
Posted 10-08-2019 05:37 AM

POST
Hello Eric, You're welcome. I am glad that I was able to help pull the blinders off. Feel free to reach out if you have any further questions about this particular problem. I am happy to be of help.
Posted 10-07-2019 09:39 AM

POST
There are some details about your environment that we would likely want to clarify, but generally speaking you will want to use GeoEvent Server to apply real-time adds or updates to a feature service whose underlying data source references tables in your SQL database (more or less). GeoEvent Server itself doesn't make a direct connection to any sort of database. Rather, it uses user-configured server connections (ArcGIS Online, ArcGIS Enterprise, ArcGIS Server), and the map/feature services available from those connections, to facilitate adds/updates to the underlying databases registered with those 'servers' (e.g., the ArcGIS Data Store as a managed database, SQL Server or Oracle as a registered database, data hosted in ArcGIS Online, etc.). In brief, adding, removing, or updating data is handled through the feature services which expose the underlying data in a database.

I am not familiar with the ins and outs of your environment, but generally speaking I could see a hypothetical workflow where you create a registered server connection in GeoEvent Server that points to your ArcGIS Enterprise. From there, you can publish feature services (or use existing feature services) which reference some sort of enterprise geodatabase, like SQL Server in your case, that has already been configured with your ArcGIS Enterprise's hosting server. As real-time information about your snow plows comes into GeoEvent Server, you can set up GeoEvent Services to perform analysis on that data and then use that new information to apply updates to the feature service you published (which is essentially pointing back to a table in SQL Server).

One thing you mentioned was creating spatial views between the event features and the street lines in SQL to then publish a street service symbolized by treatment time. While this is certainly a possibility, it is worth pointing out that this type of problem can be solved directly within a GeoEvent Service using its analytic capabilities. For example, it wouldn't be a far-fetched idea to regularly poll buffered street data (let's say every 5 seconds) as the event features. Snow plows could be brought into the same GeoEvent Service as dynamic point geofences. Using a spatial filter, the buffered street segments could be evaluated every 5 seconds to see if they "contain" any snow plow (see the sketch at the end of this reply). Assuming a street segment "contains" a snow plow, we can configure the next process of our GeoEvent Service to update the status of that street to "recently cleared in the last..." since the intersection occurred with the snow plow. The output of this hypothetical service would be your streets data with a treatment status attribute updated in real-time. If you were using this output to update a feature service, and that feature service was already in a web map, you could visualize the real-time treatment status of your roads based on a symbology that uses the changing field. We can fine-tune other aspects of the analysis to account for time, road segmentation, whether or not the plow has its arm down, etc., but I wanted to throw this out there to demonstrate one of many potential approaches to this problem.

Some of this might sound like jargon if you're new to GeoEvent Server, and that's okay. If you are new to GeoEvent Server, I would recommend working through the introductory tutorial series. It'll give you a better idea of how GeoEvent Server works and how it fits into the broader ArcGIS Enterprise.
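To make the spatial-filter idea concrete, here is a rough analogy in Python using the shapely library. GeoEvent Server performs this evaluation internally as part of its spatial filter; the coordinates and buffer distance here are entirely made up.

```python
from shapely.geometry import LineString, Point

# A street segment buffered by a made-up distance, standing in for the
# polygon event features polled every few seconds.
street_buffer = LineString([(0, 0), (100, 0)]).buffer(10)

# A snow plow position, standing in for a dynamic point geofence.
plow = Point(50, 3)

if street_buffer.contains(plow):
    # In the GeoEvent Service, this is where the street's treatment
    # status would be updated.
    print("Street segment contains a plow; mark as recently cleared")
```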
Posted 09-11-2019 07:51 AM

POST
Hi Emil, It sounds to me like there are two separate issues here. The first problem is that the refresh rate settings aren't displaying on your service, and the second problem is that you cannot visualize the data.

For the first problem involving the refresh rate settings, I would double-check the steps you followed in the module (specifically page 33). The tutorial discusses copying the FeatureServer REST URL and not the MapServer REST URL. From what I have observed, the refresh rate property isn't applicable to the map service (since it returns a map export, as opposed to a query). I'd be willing to guess that you accidentally copied the MapServer REST URL. To demonstrate this, here is what I saw when I added the map service to my web map (notice the nested layer and how it is similar to your screenshot):

[Image: The map service added to a web map, showing a nested layer without the refresh rate setting]

Likewise, here is what is available when the feature service is added to the web map. This matches the tutorial.

[Image: The feature service added to the web map, with the refresh rate setting available]

To really address the second problem involving the failed visualization, we'll need more information about your configuration to say what the source of the problem is. Without knowing more about what you're specifically seeing, there are a few general suggestions I can offer. For example, does your outbound GeoEvent Definition schema match the schema of the published feature service? Are you witnessing event records come inbound from the GeoEvent Simulator over TCP? Is your GeoEvent Server processing the event records inbound and outbound? Are you able to see the same data that is going to your Flights feature service within GeoEvent Logger? If data is in fact being sent to your Flights feature service, have you tried querying the features at the REST endpoint to see if there's valid geometry? These are just a few things I might suggest checking in order to move forward again. If you have any further information about what you're seeing happen, then please feel free to share so that the community here may better aid you. Likewise, Esri Support Services is always available to assist with the tutorial materials should you like to have a conversation.
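P.S. On that last question, querying the feature service's REST endpoint can quickly confirm whether features with valid geometry are being written. Here is a short sketch using the standard ArcGIS REST query operation; the service URL is a placeholder for your own Flights layer:

```python
import requests

# Placeholder URL for the Flights feature service's layer endpoint.
layer_url = "https://myserver.example.com/arcgis/rest/services/Flights/FeatureServer/0"

params = {"where": "1=1", "outFields": "*", "returnGeometry": "true", "f": "json"}
resp = requests.get(f"{layer_url}/query", params=params)

features = resp.json().get("features", [])
print(len(features), "features returned")
if features:
    print(features[0].get("geometry"))  # inspect for valid coordinates
```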
Posted 07-03-2019 08:57 AM

POST
It was great meeting other individuals who were new in their GIS careers. Learning about their goals and interests (as related to GIS) exposed me to different aspects of the industry and, most importantly, helped me develop professional connections. The atmosphere of the evening social (during the International User's Conference) made it a very relaxed experience.
Posted 07-03-2019 07:35 AM