POST
Hello Jay - Please reach out to me off-forum and I will see what I can do to work with you. "Then I unregistered the feature dataset as versioned" ... I think this is the issue. The add/update feature capability performed by GeoEvent has never supported versioned geodatabases. Best Regards - RJ
Posted 07-30-2015 04:39 PM

POST
Looks like your clarification was created as a new thread. Please refer to my reply to your earlier post: In GeoEvent Processor 10.3 does the "Update a Feature" output create new features if a match to the Unique Feature Identifier Field does not exist? - RJ
Posted 07-30-2015 02:20 PM

BLOG
That can be a challenge when data received from a real-time feed comes with significant delays. There should be nothing to prevent you from including a second input in a GeoEvent Service which you can use to drive your own events into your event processing flow -- either to test that the analytics you are developing do what you expect, or, in this case, to force a processor to re-create a GeoEvent Definition 'just-in-time' so you can get on with your work.

I usually use the 'Receive JSON on a REST Endpoint' input to do this. I'll use my Chrome Poster plug-in to HTTP/POST a block of JSON which matches what my actual real-time data feed would normally provide. That way I can "simulate" a single event being sent. Sometimes when my actual real-time input is noisy I'll stop it and use my 'Receive JSON on a REST Endpoint' input until I've completed the changes I want to make to the GeoEvent Service, then restart the actual real-time data feed input. If you are unfamiliar with using a browser plug-in to POST content to a URL, take a look at Module 6 in the GeoEvent product introduction tutorial. By the time I got to spatial processors I had finally grown weary of using the TCP/Text input and the GeoEvent Simulator and began using the 'Receive JSON on a REST Endpoint' input to POST data to GeoEvent.

One trick when using this input: create the input and save it, then edit it to have GeoEvent Manager reveal the URL created and hosted by GeoEvent. This is the URL to which you will want to POST your JSON data. Hope this information helps - RJ
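If you would rather script the single-event POST than use a browser plug-in, something like the following Python sketch stands in for Chrome Poster. The URL and event fields here are made-up placeholders -- use the actual URL GeoEvent Manager reveals for your own input and a JSON body shaped like your real feed:

```python
import json
import urllib.request

# Placeholder URL -- copy the real one revealed by GeoEvent Manager after
# saving and then re-editing your 'Receive JSON on a REST Endpoint' input.
url = "https://localhost:6143/geoevent/rest/receiver/my-json-input"

# A single simulated event shaped like what the real feed would provide
# (field names here are hypothetical).
event = {"flightId": "SWA2706", "x": -117.195, "y": 34.056, "altitude": 9200.0}

req = urllib.request.Request(
    url,
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment to send once your input is running
```

Running the script once sends one event, which is usually all you need to get a processor to re-create its GeoEvent Definition.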
Posted 07-30-2015 02:14 PM

POST
Hello Jay - My apologies that this reply is coming to you so late - hopefully it is still useful. Yes, the 'Update a Feature' output should default to adding a new feature to a target feature class if an existing feature cannot be found whose 'Unique Feature Identifier Field' matches the received event's TRACK_ID. The process at the 10.3 / 10.3.1 product releases goes something like this:

- An event whose GeoEvent Definition has a field tagged TRACK_ID arrives at an 'Update a Feature' output.
- The output checks an internal cache to see if it has an OID (row identifier) mapped for the event's TRACK_ID.
- If so, the output attempts to use that OID to update the specific feature.
- If it doesn't have an OID, it queries the feature layer to try to discover which feature should be updated.

If GeoEvent queries for the OID needed to update a feature and discovers that no feature exists in the feature class whose 'Unique Feature Identifier Field' value matches the current event's TRACK_ID ... then it will assume that a new feature needs to be created. Basically we're making requests on three different endpoints of the targeted feature service:

.../arcgis/rest/services/service-name/FeatureServer/layer-index/query
.../arcgis/rest/services/service-name/FeatureServer/layer-index/addFeatures
.../arcgis/rest/services/service-name/FeatureServer/layer-index/updateFeatures

You can ask GeoEvent to log the REST requests it is making by turning DEBUG logging on for the Feature Service outbound transport: com.esri.ges.transport.featureService.FeatureServiceOutboundTransport

This will enable you to see the actual JSON included with each REST request and follow the transactions GeoEvent is making with your feature service. You might find additional information in my blog 'Debugging the Add a Feature / Update a Feature Output Connectors' helpful. Best Regards - RJ
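The query -> updateFeatures / addFeatures decision described above can be sketched in a few lines of Python. This is only an illustration of the flow, not GeoEvent's actual implementation; the service URL, layer index, and the 'VehicleID' identifier field are hypothetical:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical target layer; substitute your own service name and layer index.
LAYER = "https://myserver/arcgis/rest/services/Vehicles/FeatureServer/0"

def choose_endpoint(layer_url, object_ids):
    """Mirror the output's decision: update when the query found a matching
    OID, add a brand-new feature when it did not."""
    return f"{layer_url}/updateFeatures" if object_ids else f"{layer_url}/addFeatures"

def upsert(track_id, attributes):
    # Step 1: query for the OID of a feature whose unique-identifier field
    # ('VehicleID' is an assumed field name) matches the event's TRACK_ID.
    query = urllib.parse.urlencode(
        {"where": f"VehicleID = '{track_id}'", "returnIdsOnly": "true", "f": "json"}
    )
    with urllib.request.urlopen(f"{LAYER}/query?{query}") as resp:
        oids = json.load(resp).get("objectIds") or []
    # Step 2: update the found feature, or add a new one.
    if oids:
        attributes = dict(attributes, OBJECTID=oids[0])
    body = urllib.parse.urlencode(
        {"features": json.dumps([{"attributes": attributes}]), "f": "json"}
    ).encode("utf-8")
    return urllib.request.urlopen(choose_endpoint(LAYER, oids), data=body)
```

Note that GeoEvent also caches the TRACK_ID-to-OID mapping so it can skip the query on subsequent events; the sketch above omits that cache for brevity.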
Posted 07-30-2015 01:56 PM

POST
All - Mark Bramer and I presented a technical workshop, 'Leveraging Stream Services', at the User Conference in San Diego last week. A draft of the slides from the tech workshop is available online now: Real-Time GIS: Leveraging Stream Services. Videos of the technical workshops from the 2015 UC should be available to review and download from Esri's E80 site in a few weeks. Your best bet will probably be to search for the keyword 'GeoEvent' and look for items with the 2015 UC logo.

In the workshop I demonstrated how a very simple feed providing the International Space Station's location (http://api.open-notify.org/iss-now.json) could be brought into the GeoEvent Extension and broadcast out a stream service output. Mark discusses some of the technical aspects of stream services and stream layers, and I have a demo where I add a stream service I publish to a web map as a stream layer.

It looks like folks on this thread are generally able to access the sample stream services:

geoeventsample1.esri.com:6080/arcgis/rest/services
geoeventsample3.esri.com:6080/arcgis/rest/services

... and are able to click the View In: ArcGIS JavaScript link to see the streaming data displayed using the JavaScript API. Are you able to browse to a specific stream service's subscription endpoint (e.g. geoeventsample3.esri.com:6080/arcgis/rest/services/SeattleBus/StreamServer/subscribe), click 'Subscribe', and see the streaming Esri Feature JSON displayed on the HTML page? This is usually your first test to verify that event data you are processing through the GeoEvent Extension is being broadcast by the stream service. From there it is up to a JavaScript client to subscribe and receive the streaming data.

If you are having problems adding a stream service to an ArcGIS Online web map, or a Portal for ArcGIS web map, as a stream layer using a web browser which supports HTML5 WebSockets, your best bet will probably be to contact Esri Technical Support for help identifying the issue. As David suggested, a firewall restriction, a restriction related to web sockets, or issues with SSL security certificates can all interfere with a client's attempt to subscribe and receive data being broadcast by a stream service. Thank you David Blanchard, Andreas Espersen, and Xander Bakker for contributing to the thread. Hope this information helps -- RJ
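As a rough illustration of the subscription endpoint pattern, a client can derive a WebSocket URL from a StreamServer REST URL. This assumes the conventional layout -- same host and path, ws:// or wss:// scheme, trailing /subscribe -- so verify the exact URL on your own service's subscription page before relying on it:

```python
def subscribe_url(stream_service_url):
    """Derive a WebSocket subscription URL from a StreamServer REST URL.
    Assumes the conventional ws(s):// scheme swap plus a '/subscribe' suffix;
    check your service's REST page for the authoritative URL."""
    base = stream_service_url.rstrip("/")
    if base.startswith("https://"):
        base = "wss://" + base[len("https://"):]
    elif base.startswith("http://"):
        base = "ws://" + base[len("http://"):]
    return base + "/subscribe"
```

A browser's native WebSocket client (or any WebSocket library) can then open that URL and receive the streamed Esri Feature JSON.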
Posted 07-30-2015 12:56 PM

POST
Hello Darryl - Sorry, no, there is no inbound connector out-of-the-box which will read the Esri Shapefile format. The input connectors capable of reading system file content available out-of-the-box expect the file to contain either generic JSON, GeoJSON, or delimited text (e.g. CSV). If you must retrieve your data from a shapefile you would need to develop a custom adapter using the product's SDK.

It would probably be easier to bring the shapefile content into an MXD document and publish the content as either a map service or a feature service ... then use the 'Poll an ArcGIS Server for Features' input to make REST requests on the published service to retrieve its features as event data.

Please keep in mind that the GeoEvent Extension generally supports RESTful data streams. There are inputs capable of watching a system folder for files, but these inputs have some significant limitations and are generally intended to prove that real-time analytics you have designed in a GeoEvent Service behave as you intend. Moving toward production, we expect real-time data feeds to arrive via HTTP/POST or as replies to queries you make on an external server's URL. Hope this information helps - RJ
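For reference, the 'Poll an ArcGIS Server for Features' input is essentially issuing feature service query requests on your behalf, along the lines of the sketch below. The service name and layer index are placeholders for whatever you publish from the shapefile content:

```python
import urllib.parse

# Placeholder layer URL for a service published from your shapefile content.
LAYER = "https://myserver/arcgis/rest/services/Parcels/MapServer/0"

# The kind of query the polling input issues to retrieve features as events.
params = urllib.parse.urlencode({"where": "1=1", "outFields": "*", "f": "json"})
query_url = f"{LAYER}/query?{params}"

# To actually fetch the features once the service exists:
# import json, urllib.request
# features = json.load(urllib.request.urlopen(query_url))["features"]
```

Each polled feature then enters the GeoEvent Service as an event, just as if it had arrived from a live feed.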
Posted 07-29-2015 06:18 PM

BLOG
Customers, particularly in the Federal Government space, have reported issues launching the ArcGIS GeoEvent Extension for Server when the McAfee Enterprise Suite has been deployed in their environment. The McAfee Enterprise Suite offers anti-virus and malware protection. One component of the suite, the On-Access Scanner, actively scans files used and/or accessed by a running program. This has been shown to prevent the proper installation and startup of the GeoEvent Extension: as highly compressed archive files (such as the JAR files utilized by the GeoEvent Extension) are scanned, access to the files is restricted, and multiple timeout failures can occur while waiting for the scans to complete. Please refer to KB Article #44817 on the Esri Support site for additional information.
Posted 07-02-2015 10:50 AM

POST
Hello Rob - If I understand your question, no, I don't think you will be able to easily configure the GeoEvent Extension (GEx) to discover that two sensors (or in your case two reporting vehicles) have the same identifier. Every event is made up of attributes, and any attribute (String, Date, Double, Geometry, Boolean, etc.) can be tagged as the TRACK_ID. This tells the GeoEvent Extension that the event is associated with a given source (a reporting vehicle or sensor). If a fraudulent reporter or poorly configured sensor were reporting events with a TRACK_ID assigned to another vehicle or sensor ... how would you tell, just by looking at two events, which one was the impostor?

It would be complicated, but in the vehicle tracking scenario, if you assume that a vehicle with a given TRACK_ID cannot be in two places at the same time, you could use a spatial processor to buffer the vehicle's position and send the polygon buffers out a stream service. You could then configure a GeoFence synchronization rule to update the extension's catalog of GeoFences as events are received from the stream service. You would have to be careful to enrich each vehicle's report with a timestamp of when the event was received and calculate a fairly short time range (so that each buffer had a TIME_START and a TIME_END) before it was broadcast out the stream service. That way the GeoFence synchronization would have the information it needed to expire a GeoFence which was out-of-date, and the GeoEvent Extension could purge the "old" GeoFences.

You then might use a GeoEvent Service to check if a given vehicle report's buffer is DISJOINT with a previous buffer for the same TRACK_ID. Two vehicles reporting on opposite sides of the city would be the trigger you would use to send an event to an alerting output (such as email or SMS text). This would not be a simple configuration to set up ... but it is one idea ... off the top of my head. Hope this information is helpful - RJ
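To give a concrete flavor of the "cannot be in two places at once" test, here is a simplified stand-in for the buffer/DISJOINT idea sketched above: flag a TRACK_ID whose consecutive reports imply an impossible travel speed. The 50 m/s threshold and the assumption of projected (metre) coordinates are illustrative choices, not anything built into GeoEvent:

```python
import math

def implies_impostor(p1, t1, p2, t2, max_speed_mps=50.0):
    """Flag a TRACK_ID whose consecutive reports imply an impossible speed.
    Positions are (x, y) in metres (a projected CRS is assumed); times are
    in seconds. The max-speed threshold is an arbitrary example value."""
    distance = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    elapsed = max(t2 - t1, 1e-9)  # guard against identical timestamps
    return distance / elapsed > max_speed_mps

# Two reports 10 km apart but only 10 seconds apart -> 1000 m/s: impostor.
print(implies_impostor((0.0, 0.0), 0.0, (10000.0, 0.0), 10.0))
```

The buffer/GeoFence approach in the post effectively performs this same test spatially, letting the extension's GeoFence expiry handle the time window for you.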
Posted 06-24-2015 12:15 PM

POST
Hello Thibaut - Can you provide me with some steps I can use to reproduce the issue you are describing? If I remember correctly, the web socket capability was first introduced with the 10.2.2 product release. It was developed as a proof of concept for the stream service capability which was introduced with the 10.3 release; stream services rely on web sockets behind the scenes. I don't believe that there have been many bug fixes implemented for the initial web socket capability. I know, for example, that there is an outstanding issue that a 'Push JSON to an External WebSocket' outbound connector will not honor its 'Idle time to disconnect' property. I've observed that the output will time out after 600 seconds, regardless of what you've configured, if no data is broadcast across the web socket connection ... and will not attempt to reconnect.

Are you using the 'Receive JSON on a WebSocket' input out-of-the-box ... which runs in SERVER mode? How is your client end of the web socket sending data to the running GeoEvent input? If you can help me identify a way to force the inbound web socket to experience an error, I'll see what I can do to work with a developer and determine if there is something we can do to have the input handle the error and/or attempt to restore its end of the web socket connection if necessary ... but no, there is no configuration you can change that should affect what you are seeing. If the input drops its end of the connection, all you can do is manually restart the input connector. - RJ
Posted 06-22-2015 06:59 PM

POST
Hello Ben - Thanks for the question. There is no hard-coded limit or expiry rule for the number of track identifiers a Track Gap Processor will cache. Perhaps there should be. As you suggest, a track identifier that is received at some point will remain in the processor's cache consuming system resources, even if that track identifier will never reasonably be seen again.

I asked one of the developers for a code review and tested what he told me to verify. It appears that the Track Gap Processor not only caches the identifier for every TRACK_ID it encounters, it also spawns a thread and associates the thread with the cached TRACK_ID. So, at the 10.3 / 10.3.1 release, you can expect that a GeoEvent Service which incorporates a Track Gap Detector will remember every TRACK_ID it encounters, and the number of threads associated with the JVM's java.exe process will increase until a system resource is exhausted -- unless you alter the GeoEvent Service and publish your changes (or cycle the GeoEvent Windows Service). Spawning so many threads strikes me as unnecessary, and we will probably change the threading model in the future so it does not spawn so many. You likely would not notice, however, unless you were monitoring that particular system resource.

If you encounter any problems with the 10.3 / 10.3.1 release, please submit a product enhancement request with Esri Technical Support. That will help ensure that an issue is logged against the product for the development team to address. Hope this information helps - RJ
Posted 06-22-2015 06:20 PM

BLOG
Thanks Thibaut ... yes, I found a bug logged against the 10.3 product release and replied to the thread: Updating GeoEvent Services causes GeoEvent Definitions to disappear - RJ
Posted 06-22-2015 05:29 PM

POST
Hello Carmen / Thibaut - I did some digging and discovered an issue logged against the 10.3 product release which appears to match the behavior you describe above. For your reference, the issue is BUG-000087381 in the Esri Technical Support queue.

Working through the issue's reproduction steps, it is fairly easy to reproduce. I conducted my tests using the 10.3.1 product release. I first create a GeoEvent input to poll a feature service for features, configuring the input to use a GeoEvent Definition I created and own. I then create a GeoEvent output to display event data received, and publish a GeoEvent Service with the input and output to verify event flow. I then edit the GeoEvent Service to include a GeoTagger, configuring the GeoTagger to write its information to a field which already exists in the GeoEvent Definition being used by the running input. Interestingly, when I publish the changes to the GeoEvent Service, my output continues to display event data being received by the GeoEvent Service. The displayed event data, however, is not being enriched with the names of GeoFences which satisfy the GeoTagger's configured spatial expression. Checking the GeoEvent log I see errors resembling the following:

com.esri.ges.messaging.jms.MessagingImpl null
java.lang.reflect.UndeclaredThrowableException
com.esri.ges.core.geoevent.FieldException: Cannot set field on a immutable GeoEvent

If I remove the GeoTagger from the GeoEvent Service's event processing flow and republish the GeoEvent Service, the event data which was being displayed by my output stops. A quick check reveals that the GeoEvent Definition which was being used by the input - the GeoEvent Definition I created and own - has been deleted. This is a bug; the GeoTagger should not be deleting a GeoEvent Definition it did not create.

The attempted workflow, however, is flawed. A GeoTagger is more like a Field Enricher than a Field Calculator. By that I mean that the GeoTagger expects to be able to enrich an event with information: the name, and optionally the category, of GeoFences which satisfy the GeoTagger's spatial expression. Event enrichment assumes that the field or fields into which the processor will write its information do not already exist. You should expect some sort of exception to be logged by both the GeoTagger and the Field Enricher if they are configured to write information to fields which exist in the event structure received by the processor. We will work to address this issue in an upcoming release of the product, but I would advise for now that you take care to configure a GeoTagger to write to a field which does not exist in the event structure received, guaranteeing that the processor will create a "managed" GeoEvent Definition which the processor owns and has the right to delete. - RJ

10.4 Product Update: The issue described above was tested and marked as resolved in the 10.4 product release. 10.4 should be publicly available in February 2016.
Posted 06-22-2015 05:12 PM

POST
Hello Adam - I've received some information on your question I can share. I don't have any first-hand experience with VMware replication, but there are some architecture consultants within Esri Professional Services who do. Your customer service representative could help put together a consulting engagement, if that is something you would like to pursue. Please reach out to me off-forum and I'll provide you with some names of consultants with experience in this area.

Speaking generally, vMotion is something that you would want to integrate within the data center. VMware vCenter's Site Recovery Manager (SRM) is used to move your VMs to the disaster recovery (DR) site and spin them up. vMotion can create the VMs, and SRM can then move these VMs to DR to establish a similar enterprise GIS operational environment. The Esri architecture consultants who responded to me have some documentation for implementing DR, but a lot of the consulting is identifying exactly what you are trying to accomplish. Here are some considerations you may need to address:

- Data currency and ArcGIS Server site configuration. If you have a primary data center it is relatively easy to maintain a DR site at the same operational level with a slightly different server configuration. For example, N+1 fault-tolerant servers are not normally deployed within a disaster recovery site. With ArcGIS Server deployments, changes made to your primary data center can be replicated to your DR site. For the most part this is a static set of operations, and DR can take advantage of primary data center licensing.

- Maintaining your Desktop operational environment. This can be a challenge when the data being used is constantly changing. For example, if analysts need the ability to continue data editing and service publication tasks from where they were when the primary site went down, snapshots or replication of changes must also be moved to DR. This can be done during data center operations, but it requires additional licensing.

- Portal for ArcGIS is another dynamic that needs to be incorporated into the vMotion and SRM environment. You'll have to implement monitoring for facility failure and either manually or automatically switch the Portal configuration to reroute requests from the primary data center to the DR data center.

Hope this information helps - RJ
Posted 06-22-2015 12:01 PM

POST
Hello Thibaut - The behavior you are seeing is by design. Please take a look at a blog I just posted: Understanding GeoEvent Definitions. It seems there have been a few questions in this area lately.

The GeoTagger processor is one of the processors which will add a field to the event it is processing. Adding or removing fields from an event changes the event's structure, requiring the processor making the change to construct a new GeoEvent Definition. I tend to refer to these as "managed" GeoEvent Definitions, since they are owned by the processor and will be deleted when changes are made to the GeoEvent Service and the service is republished. The location of the processor within the GeoEvent Service does not matter; it could be located immediately before an output (as you indicate your GeoTagger is) or further "upstream", closer to the input node. The processor owns the GeoEvent Definition it creates; it will delete that GeoEvent Definition if it believes it to be stale, and will wait to receive an event before creating a new one. Hope this information helps - RJ
Posted 06-05-2015 03:54 PM

POST
Hello Brad - Yes, if a GeoEvent Definition exists in a product configuration but a processor feels that it needs to create one, it is possible that you will find what appear to be duplicate GeoEvent Definitions listed by GeoEvent Manager after event data has been processed by one of your GeoEvent Services. It is also possible that two different processors, in two different GeoEvent Services, have been configured to create a GeoEvent Definition with the same name, producing what appear to be duplicates in the Manager's GeoEvent Definitions list. If you use GeoEvent Manager to select and delete a GeoEvent Definition a processor owns and needs, the processor should recover by re-building the GeoEvent Definition when it next receives an event to process.

You might find information identifying which GeoEvent Definitions are associated with which processors in a GeoEvent Service by reading through an XML export of a GeoEvent configuration. Notice in the illustration below that the GeoEvent Service has "nodes", one of which is a processor, and the processor has properties. Usually, however, I just look at the XML to see if it is importing a GeoEvent Definition for me, delete any GeoEvent Definitions I don't think I need after importing the XML, then run a few test events through the GeoEvent Service to see which GeoEvent Definitions get created.

As a general rule, I try not to include "managed" GeoEvent Definitions (those created by a processor or an inbound connector's adapter) in the XML configurations I export. I try to make copies of the event definitions I need and let processors / adapters create fresh event definitions on-the-fly. Hope this information helps - RJ
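A small script can make the "read through the XML export" step less tedious. Note the element tag and 'name' attribute used below are assumptions for illustration -- inspect your own export to see how GeoEvent Definitions are actually tagged in the configuration schema:

```python
import xml.etree.ElementTree as ET

def definition_names(xml_text, tag="geoeventdefinition"):
    """List the names of GeoEvent Definitions found anywhere in an exported
    configuration. The element tag and 'name' attribute are assumptions --
    adjust them to match the schema of your actual export."""
    root = ET.fromstring(xml_text)
    return [el.get("name") for el in root.iter(tag)]

# Tiny illustrative document (not a real GeoEvent export).
sample = '<config><geoeventdefinition name="Bus-Position"/></config>'
print(definition_names(sample))
```

Running this over an export before importing it elsewhere gives you a quick list of which event definitions the XML would bring along, so you can decide which ones to prune.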
Posted 06-05-2015 03:41 PM