POST
Hey Sharon - Are you able to log in to the GeoEvent administrative API? Browse to https://server-name:6143/geoevent/admin and follow the prompts to obtain a token and log in. Then browse to the GeoEvent Definitions endpoint: https://server-name:6143/geoevent/admin/geoeventdefinitions

How many GeoEvent Definitions do you have? I deliberately created over 1800 event definitions, which introduced significant delays in responses to my REST requests (5 - 7 seconds) ... but I was still able to get the browser to list all of the event definitions.

It is possible to use a browser plug-in (e.g. Chrome Poster) to make an HTTP/POST request on the following admin endpoint: https://majere.esri.com:6143/geoevent/admin/geoeventdefinitions/deleteByIds/.json ... but you will need to specify the GUID associated with each GeoEvent Definition (not the event definition name) in the body of the request as a comma-separated list of strings. Building that request for thousands of items is going to be a pain. Refer to the attached illustration.

Do you have any event definitions you particularly care about? It's probably going to be easier to just use the GeoEvent Manager and select 'Delete All'. You might have to do this a couple of times. I'm not sure why; maybe a "Delete All" which runs into one of the event definitions which cannot be deleted (incident and TrackGap) halts the deletion.

You can also do this administratively with your favorite browser plug-in. Just make an HTTP/DELETE request on the following endpoint (as many times as necessary to actually delete all of the event definitions): https://majere.esri.com:6143/geoevent/admin/geoeventdefinitions/.json Refer to my second screenshot / illustration attached... - RJ
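To give a feel for what building that deleteByIds request body involves, here is a minimal Python sketch. The JSON shape and GUID values below are illustrative assumptions, not the documented response schema -- check the JSON your own admin endpoint returns before relying on any field names.

```python
import json

# Hedged sketch: pull GUIDs out of a (hypothetical) definitions listing
# and join them into the comma-separated body deleteByIds expects.
sample_response = json.loads("""
[
  {"guid": "a1b2c3d4-0000-0000-0000-000000000001", "name": "def-one"},
  {"guid": "a1b2c3d4-0000-0000-0000-000000000002", "name": "def-two"}
]
""")

def build_delete_body(definitions):
    """Join each definition's GUID (not its name) into one comma-separated string."""
    return ",".join(d["guid"] for d in definitions)

body = build_delete_body(sample_response)
print(body)
# a1b2c3d4-0000-0000-0000-000000000001,a1b2c3d4-0000-0000-0000-000000000002
```

Scripting the GUID extraction this way is far less painful than hand-assembling the body for thousands of items in a browser plug-in.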
08-19-2015 07:22 PM
POST
Hello S R - Based on my (admittedly limited) experience working with external RDBMSs like Oracle, I don't think the issue you are encountering is GeoEvent specific. However, here is some advice I received from one of the database administrators in Esri Professional Services.

1. To specifically address the message you quote below: connect to the database as a data_editor, then delete the records from DATA.VEHICLES. You may find that you have to have your database administrator release locks on the data (refer to the following bullet).

2. Never use a data owner or administrative connection to publish feature services. Your database administrator should provide you data owner credentials for creating new feature classes (tables) in the database. You should disconnect and reconnect as a user with a data_editor role before publishing a feature class as a feature service. Use a data_viewer role if you will not be adding, editing, or deleting features (not generally the case when using GeoEvent). A data_editor has select, insert, update, and delete privileges on a table; a data_viewer has only select privileges. What happens is that clients using the published feature service end up inheriting the publisher's privileges. So publishing as a data owner means that a client (e.g. GeoEvent) which interacts with the feature service to add, update, or delete features has the potential to acquire a full-table lock. When another client tries to access the data through the same feature service, it also makes its attempt as a data owner, potentially acquiring a full-table lock. This can produce a deadlock situation where two data owners each hold locks on the table and end up locking each other out.

3. You may find that the table's indexes have become fragmented. You may need to have your database administrator gather new statistics, reorganize the tables, and rebuild table indexes. Index fragmentation can happen quickly if a table has undergone large bulk inserts, updates, and deletes. This affects your DML (data manipulation language) performance ... e.g. insert, update, and delete operations. You may need to have your database administrator run a database trace to investigate the DML execution plans in order to identify and address performance issues.

Hope this information helps - RJ
08-19-2015 06:15 PM
POST
Hey Sharon – Since the out-of-the-box 'Receive Text from a TCP Socket' input is configured to run in SERVER mode, I would recommend you:

- Navigate to the Site > GeoEvent > Connectors page in the GeoEvent Manager.
- Use the filter pull-down to show only the 'Inbound' connectors.
- Click to copy the existing 'Receive Text from a TCP Socket' connector.
- Edit the descriptive properties as illustrated ... then save your customized copy of the input.

Unfortunately, you need to open, edit, save, then re-open your copy of the input to work around a bug in the UI. Once you've made the descriptive edits above and saved, click to edit your customized copy of the input:

- Toward the bottom of the input connector's configuration, move both the 'Host' and 'Mode' properties from 'Hidden Properties' to 'Advanced Properties'. You can move these properties up the list of advanced properties to place them next to 'Server Port' if that makes more sense to you.
- Double-click the 'Mode' property and overwrite the default value with a preferred setting of 'Client', then save the setting.
- Save your customized copy of the input.

Now when you create an instance of an input using this connector as a template, you will see that it has been configured to run in CLIENT mode and has properties which allow you to specify the server and port to which you want to connect ... as a client.

Watch your CPU and other system resources. We had a bug at the 10.3 release, which I think was addressed at 10.3.1, in which a TCP/Text inbound connector configured to run in CLIENT mode was misbehaving (polling the socket on the external server too frequently) and consuming all of the server's CPU resource.

Hope this information helps – RJ
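If the SERVER/CLIENT distinction is unclear: in CLIENT mode the consumer initiates the TCP connection out to the data provider's host and port, rather than listening for the provider to connect in. A minimal Python sketch of that role (the throwaway provider and the CSV record here are purely illustrative, not anything GeoEvent-specific):

```python
import socket
import threading

# Stand-in "provider": listens on localhost and sends one delimited text record.
def run_provider(server_sock):
    conn, _ = server_sock.accept()
    conn.sendall(b"ID123,34.05,-117.19\n")  # one CSV event record
    conn.close()

provider = socket.socket()
provider.bind(("127.0.0.1", 0))   # ephemeral port, loopback only
provider.listen(1)
host, port = provider.getsockname()
threading.Thread(target=run_provider, args=(provider,), daemon=True).start()

# CLIENT mode: we connect out to the provider's host/port and read text.
client = socket.create_connection((host, port))
data = client.makefile().readline().strip()
client.close()
print(data)  # ID123,34.05,-117.19
```

This is why the customized connector needs the 'Host' property exposed: a client has to know where to connect, whereas a server only needs a port to listen on.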
08-18-2015 07:53 PM
POST
Hello Dominik - The tutorial you referenced in your post above is seriously out-of-date. A couple of examples I spotted when skimming through it: we no longer have an external GeoEvent Service Designer; the ability to design and publish GeoEvent Services was integrated into the GeoEvent Manager back at the 10.2.1 release. Also, the GeoEvent Cache output has been deprecated, so you will not find it available out-of-the-box if you were to try and re-create the 'Poll an External Website for JSON' exercise without using the product XML included with the tutorial. I apologize if information you took from the tutorial's exercises resulted in any confusion.

I do not think you are able to use a GeoEvent Service in the way you describe. A GeoEvent Service expects event data to be sent to one of its inputs ... not to receive a request for event data. From this perspective it makes sense that GeoEvent inputs do not honor HTTP/GET requests. They expect HTTP/POST requests with event data in the body of the request.

The language here is kind of funny. You're not really making a "request" of a GeoEvent Service ... like you might "ask" or "request" that a feature service return to you features which satisfy a particular WHERE clause. With GeoEvent you are sending data to an input's adapter for the adapter to parse and use to create a GeoEvent. The event stream is then subject to filtering and processing by a GeoEvent Service, and the processed event(s) are sent to an output. A GeoEvent Service is not really handling "requests". It is receiving, filtering, and processing event data ... then sending that data out an output.

I believe if you were to HTTP/POST a request with information included as parameters in the request's header, the parameter information would be ignored. The URL is sufficient to deliver the request to a GeoEvent input ... and the input's adapter is only interested in taking information from the request's body, not from parameters accompanying the request.

If I have missed the point of your question, maybe if you can describe what you would like to see a GeoEvent Service do with the latitude and longitude coordinates (and other event attributes) it receives, I can suggest an approach. I don't think you'll be able to send event attributes as parameters ... they will have to be sent in the body of an HTTP/POST request. Hope this information helps - RJ

If there are particular tutorials which you would like to see updated, or content you expect in a tutorial's exercise which is missing, unclear, or appears to be out-of-date, please email the GeoEvent product team (geoevent@esri.com) and include "GeoEvent Tutorial Feedback" in the subject of your email.
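The body-vs-parameters point can be sketched in a few lines of Python. The tiny server below is only a stand-in for a REST input's endpoint (the receiver path is a made-up example); the handler reads nothing but the request body, which is the behavior described above.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

received = {}

class Handler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Only the body is consumed; query parameters would simply be ignored.
        length = int(self.headers["Content-Length"])
        received["body"] = json.loads(self.rfile.read(length))
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

event = {"assetId": "truck-07", "lat": 34.05, "lon": -117.19}
req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_port}/geoevent/rest/receiver/demo-in",
    data=json.dumps(event).encode(),            # event data goes in the body
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req).close()
server.shutdown()
print(received["body"]["assetId"])  # truck-07
```

Note the request is a POST purely because a body is attached; a GET with the same attributes encoded as URL parameters would carry no body for an adapter to parse.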
08-18-2015 04:56 PM
POST
Hello - Assuming you are interested in continuing to have mobile devices connect via web socket, you will probably want to use a GeoEvent stream service output. There is no built-in concept of addressing event data broadcast by a stream service output to a specific recipient, single mobile device, or group of devices. A client subscribing to the stream service generally receives all event data being broadcast by the output.

If it is acceptable that clients subscribing to the stream service can potentially receive all event data, but you would prefer they receive only the data intended for them, they can use the attribute filtering supported by the web socket to configure the connection. The event data would, of course, need to include an attribute the subscribing client could include in its filter. To implement this you will probably need to develop a solution using the ArcGIS API for JavaScript to bring event data broadcast by the stream service in as a stream layer. That way you can use the setDefinitionExpression method (on the StreamLayer class) to configure each client/server web socket connection such that only data with a client-specific attribute is sent from the server to a specific client.

If you do not want to develop a client-side solution, you could configure a GeoEvent Service to handle the filtering. You would configure a Filter element within a GeoEvent Service such that only messages for a particular client (based on the client identifier) were sent to a particular stream service output. The GeoEvent Service would then have separate outputs for each potential client. Individual clients could then be directed to subscribe to a specific GeoEvent output's stream and receive all of the data being provided by that output. The downside to the server-side approach is that you would have to "hard code" filters into a GeoEvent Service to handle all of the different client identifiers. This isn't necessarily a problem if you have a fixed, well-known set of subscribers -- but it could become a headache if you have to update your GeoEvent Service every time you want to add a subscriber to the solution.

Hope this information helps - RJ

Note: A draft of the 'Leveraging Stream Services' technical workshop slides from the 2015 User Conference is available at the following link: Real-Time GIS: Leveraging Stream Services. Videos of the technical workshops should be available to review and download from Esri's E80 site in a few weeks.
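The server-side alternative boils down to routing each event to a per-client destination based on an attribute. A toy Python sketch of that routing logic (the attribute name, client identifiers, and event shape are all made up for illustration -- in GeoEvent this would be Filter elements feeding separate stream service outputs, not code you write):

```python
# Stand-ins for per-client stream service outputs.
outputs = {"clientA": [], "clientB": []}

def route(event):
    """Send an event only to the output matching its client identifier."""
    target = outputs.get(event.get("clientId"))
    if target is not None:          # events for unknown clients are dropped
        target.append(event)

for ev in [{"clientId": "clientA", "val": 1},
           {"clientId": "clientB", "val": 2},
           {"clientId": "clientA", "val": 3}]:
    route(ev)

print([e["val"] for e in outputs["clientA"]])  # [1, 3]
```

The hard-coding problem is visible even here: adding a subscriber means adding a new entry (a new filter and output), which is exactly the maintenance headache described above.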
08-18-2015 03:12 PM
POST
Hello Arun - The issue you're seeing, in which files are written to the following system folder, is a known 10.2.2 behavior: …\data\activemq\localhost\localhost\tmp_storage

When a GeoEvent Service's inbound connector receives more event data than the GeoEvent Service is able to process, the unprocessed events are held in memory until the product's cache is full. Unprocessed events then get written out to disk. ActiveMQ was not doing a very good job of cleaning up event data it had persisted to disk … which is why you are seeing files begin to pile up in the tmp_storage folder.

There is not much you can do at the 10.2.2 release to address this. You can edit the com.esri.ges.messaging.jms.cfg file (found in the GeoEventProcessor\etc folder) to increase the memory limits so GeoEvent will hold more messages in memory and not write so many files to disk. I've attached a screenshot showing the properties ... but these edits are really just masking the issue. If an event provider is consistently sending GeoEvent more event data than its GeoEvent Services can reasonably handle, the event backlog will grow and eventually cause the system to fail. If unprocessed events are being dumped to disk, you will likely see something in the karaf.log along the lines of: "org.apache.activemq.broker.region.Queue" … "Usage Manager Memory Limit reached. Producer (ID:XXX) stopped to prevent flooding queue". By the time I've seen messages like the above logged, I've already pushed GeoEvent into a bad state. I've had to stop the GeoEvent Windows Service, delete everything beneath …\GeoEventProcessor\data (to forcefully reset the product), and then restart.

The better solution is to upgrade to the 10.3.1 product release. The product does a much better job of managing its event backlog at the 10.3 and 10.3.1 releases, working to reduce the backlog during periods in which event data is not being received. Testing high volume / high velocity data for 10.3, we found that adding 4 - 8 GB of RAM to a system significantly increased event throughput. One of our holistic testers, in an Esri lab, managed to saturate the network card before depleting the system's RAM. (They ended up crashing the system, so remember that there are system resources in play other than RAM, disk I/O, and CPU.)

Hope this information is helpful - RJ
08-17-2015 07:14 PM
POST
Since the feed has no coordinate information, it doesn't make logical sense to add / update features in a feature service. But we don't need to let that stop us. I published a feature service with an attribute schema matching the feed's attributes. I imported an event definition from the feature service and deleted the objectid and geometry fields -- since I have no data for these coming from the feed. I created an 'Update a Feature' output, using <guid> as the unique field for updates, and began using the feature service to cache data received from the RSS feed. I noticed that the feed advertises its date/time values as EST ... so they're placed into the feature class as UTC values (which have a five-hour offset from EST). I was then able to configure a 'Poll an ArcGIS Server for Features' input to use the same tailored event definition (sans objectid and geometry) to retrieve just the new records from the feature service. Good workaround - thanks for suggesting it. - RJ
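The five-hour offset is just the EST (UTC-05:00) to UTC conversion; a short Python check, with an illustrative timestamp:

```python
from datetime import datetime, timedelta, timezone

# EST is UTC-05:00, so converting an EST value to UTC adds five hours
# to the wall-clock reading (the instant in time is unchanged).
EST = timezone(timedelta(hours=-5))
published = datetime(2015, 8, 14, 11, 59, tzinfo=EST)  # feed advertises EST
stored = published.astimezone(timezone.utc)            # feature class stores UTC
print(stored.hour)  # 16
```

So a feed item published at 11:59 EST lands in the feature class as 16:59 UTC, which is the offset described above.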
08-14-2015 11:59 AM
POST
Hello K - Good to meet you out in San Diego at the UC. Here's some information I got from one of our developers on your question. The 'Receive RSS' input uses standard HTTP header fields to determine whether or not data in a response to a poll is "new". Specifically, the input relies on the "Last-Modified" header as described in Section 14 of RFC 2616, "Header Field Definitions", for the HTTP/1.1 protocol. The input does not cache a date/time value like the 'Poll an ArcGIS Server for Features' input does, so it cannot poll for data which is newer than a cached value. If FEMA's feed doesn't include the needed header, the input will fall back to polling for all data. I suppose you could use a Field Calculator to obtain an event's receivedTime() and then compute the difference between that and the event's advertised publish date, using a filter to discard anything deemed "old" ... but that approach seems prone to error. You're probably going to receive duplicates you don't want and might filter events you actually do want.
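For what it's worth, a sketch of that (error-prone) fallback in Python -- compare an item's advertised publish date against the time it was received and discard anything older than some threshold. The threshold, dates, and function name are all illustrative:

```python
from datetime import datetime, timedelta, timezone
from email.utils import parsedate_to_datetime

def is_new(publish_header, received_at, max_age=timedelta(minutes=10)):
    """Keep items whose RFC 2822 publish date is within max_age of receipt."""
    published = parsedate_to_datetime(publish_header)
    return received_at - published <= max_age

received_at = datetime(2015, 8, 14, 12, 0, tzinfo=timezone.utc)
print(is_new("Fri, 14 Aug 2015 11:55:00 GMT", received_at))  # True
print(is_new("Fri, 14 Aug 2015 09:00:00 GMT", received_at))  # False
```

You can see the failure modes immediately: an item republished inside the window is a duplicate you keep, and a slow feed pushes legitimate items outside the window.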
08-14-2015 11:57 AM
POST
Hello Sebastien - On your first question: if you are using an ArcGIS.com web map to display feature data broadcast from a stream service, your only option is what we would refer to as "Single Symbol" symbology in ArcMap. You cannot configure a set of coded values - 0, 1, 2 for example - to display as "red", "yellow", and "green" dots. You could cheat, of course, and add your stream service to your web map three times, as three separate layers, then use a filter (configured on each stream layer) to display only items whose coded value is 0, 1, or 2. Then you could configure each layer's "Single Symbol" symbology to be either "red", "yellow", or "green" dots. (See my screenshot, attached.)

On your second question: if you have a field tagged TRACK_ID in the GeoEvent Definition being used by the stream service output, existing graphics displayed by a web map's stream layer will be updated as event data is broadcast by the stream service (vs. new graphics created for each event). However, the cheat above, where you display the "red", "yellow", and "green" values as separate stream layers, defeats the TRACK_ID updating because each layer will update its own graphic, while the old signal value is still displayed by another layer. Given your combination of requirements - control over your stream layer symbology to create unique value rendering, and not displaying graphics on top of one another - I think you are going to have to use the JavaScript API and develop your own web mapping application. Hope this information helps - RJ
08-13-2015 06:09 PM
POST
Hello Kevin - Thanks for reaching out. I am going to need some additional information from you. I'm assuming you have developed a JavaScript application using the jQuery library ... and that when running your application on your desktop you are able to use it to POST JSON to the REST endpoint hosted by the GeoEvent Extension, and the intended GeoEvent input receives the JSON event data. But when you run your application from the iPad you are returned some sort of HTTP error? Can you tell me what the returned error message is?

I'm not very familiar with Cross-Origin Resource Sharing (CORS). I read a little about it on the HTTP access control (CORS) page on the Mozilla Developer Network. Usually when I'm asked about a client making an HTTP POST request to GeoEvent, the issue is that the server is behind some sort of firewall and the client (or external server) cannot reach the GeoEvent REST endpoint. The issue is normally that the client or external server simply cannot see the server running GeoEvent, not a cross-domain restriction on specific resources.

I spoke with one of the developers, who suggested to me that CORS is enforced by the browser, on the client, to prevent malicious code from being downloaded to the client. I don't think the GeoEvent Extension particularly cares who or what is posting to its input's REST endpoint. The input's transport delivers a payload to the adapter, and if the adapter can interpret the received JSON it proceeds to create a GeoEvent. Using a Firefox browser, I used the poster-extension to make an OPTIONS request on the administrative endpoint for my GeoEvent input. I was returned an HTTP 200 / OK message indicating that the operations POST, GET, DELETE, PUT, OPTIONS, and HEAD were all allowed.

The person I'd like to review this with is unavailable until August 17th. I don't have any examples on enabling CORS for GeoEvent. I'll see what I can do to have someone look into jQuery POST requests and get back with you. Best Regards - RJ
08-07-2015 06:53 PM
POST
Hello Rainer - The issue you referred to -- inbound JSON over TCP fails at high velocity when more than one message every 100 ms is sent -- is being worked on by a developer, Morakot, on the GeoEvent team. Morakot has been in contact with Thorsten Braun from Geosecure as well as Roman Starý from Esri Germany this week (7-Aug-2015). We are aware of the priority with regard to Geosecure's upcoming functional acceptance test, and I'm sure that Morakot is making every effort to prepare a fix and deliver something for on-site validation tests. Best Regards - RJ
08-07-2015 05:27 PM
POST
Hello Giancarlo - It looks like the ActiveMQ for GeoEvent item on the Gallery still has the 10.2.x version of the transport. I will work with the team to get an updated compilation of the JAR uploaded to the Gallery so that you can download it. If you have developer resources available to you, you can also find the transport on GitHub in the Esri / activemq-for-geoevent repository. You can download the code from there and recompile the transport for the 10.3.x product release. Hope this information helps - RJ
08-07-2015 04:46 PM
POST
Hello Dominik - You need to look at the schema of the feature class you have exposed through your published feature service. The original feature class, loaded into ArcMap from the map packages included with the tutorial, has a specific set of fields. You can review the feature service's schema from its endpoint in the ArcGIS REST Services Directory. For example: http://localhost:6080/arcgis/rest/services/.../Flights/FeatureServer/0

Fields:
- objectid ( type: esriFieldTypeOID , alias: OBJECTID , editable: false , nullable: false )
- flightnumber ( type: esriFieldTypeString , alias: FlightNumber , editable: true , nullable: true , length: 50 )
- starttime ( type: esriFieldTypeDate , alias: StartTime , editable: true , nullable: true , length: 36 )
- originairportcode ( type: esriFieldTypeString , alias: OriginAirportCode , editable: true , nullable: true , length: 50 )
- destinationairportcode ( type: esriFieldTypeString , alias: DestinationAirportCode , editable: true , nullable: true , length: 50 )
- aircrafttype ( type: esriFieldTypeString , alias: AircraftType , editable: true , nullable: true , length: 50 )
- altitude ( type: esriFieldTypeInteger , alias: Altitude , editable: true , nullable: true )

When you configured your Field Calculator to calculate AltitudeInMeters and had the processor add that as a new field to the GeoEvent being processed, you added a field (to the event) which does not exist in the feature service you originally published. You need to go back to ArcMap, add the new field to your original feature class, and then publish a new feature service ... or select to overwrite the existing feature service. Does that make sense? You cannot use GeoEvent to alter a feature service's schema. You use GeoEvent to filter, process, and enrich event data. You must then map the processed event data to match the schema of a feature service so that you can update features using data available in the events you have processed.

Once you have published a feature service which has both an Altitude field and an AltitudeInMeters field, then, in GeoEvent, you would import a GeoEvent Definition from the new feature service ... so that you can field map event data from the event definition created by the Field Calculator to the event definition imported from the newly published feature service. Hope this information helps - RJ
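Conceptually, the calculate-then-field-map pipeline looks like the Python sketch below. The event shape and function names are illustrative stand-ins for the Field Calculator and Field Map steps, not GeoEvent APIs; the only fixed fact is the unit conversion (1 foot = 0.3048 meters):

```python
def calculate_altitude_in_meters(event):
    """Field Calculator stand-in: enrich the event with a new field."""
    event = dict(event)
    event["AltitudeInMeters"] = round(event["Altitude"] * 0.3048)
    return event

def field_map(event, target_fields):
    """Field Map stand-in: keep only fields present in the target schema."""
    return {name: event[name] for name in target_fields if name in event}

event = {"FlightNumber": "UA42", "Altitude": 10000}
enriched = calculate_altitude_in_meters(event)

# Target schema of the re-published feature service, now including the new field.
mapped = field_map(enriched, ["FlightNumber", "Altitude", "AltitudeInMeters"])
print(mapped["AltitudeInMeters"])  # 3048
```

The key point the sketch makes: the mapping step can only drop or rename fields to fit the target schema -- if AltitudeInMeters isn't in the published schema, no amount of event processing will put it there.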
08-07-2015 02:14 PM
BLOG
Hey Ben - I love your idea of using a 'Poll an ArcGIS Server for Features' input, whose 'Query Definition' has been set to poll features whose TO_DELETE attribute has been set to 1, and having the input delete any features it has polled. I'd never considered using an input to perform a data cleansing task this way, after updates have been made to the features by an output. This illustrates that inputs, as runnable components, can manipulate a dataset (by deleting data) even if the input has not been incorporated into a GeoEvent Service. Simply creating the input and allowing it to run will result in records flagged for deletion being deleted; you do not have to route the input's events through any filtering or processing or send them to an output.

You mention that configuring an output to do this deletion would be difficult if the event data didn't include a timestamp. You could configure a Field Calculator with an expression which invokes the function receivedTime() to write the date/time an event was received into a new field - or overwrite the value in an existing field if you cared more about the date/time GeoEvent received the event than the date/time a provider associated with the event. On that latter point, using a Field Calculator to overwrite a value in an event's Date field, you could artificially mark a feature as "old" by having the Field Calculator subtract a number of milliseconds from receivedTime(), then update features with the artificially aged date/time. Now when the input polls to look for features which ought to be deleted, it will find features which look "old" and delete them.

You are correct, though. Forcing clients to figure out how to determine "now" in order to filter out features which are some number of minutes older than "now", so that they are not displayed, is harder than configuring a map layer to filter features whose TO_BE_DELETED attribute has been set to 1. That's another reason I liked your use of the input to delete "old" features... - RJ
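The "artificial aging" trick above is just timestamp arithmetic; a short Python sketch with illustrative values:

```python
from datetime import datetime, timedelta, timezone

def aged_timestamp(received_time, age_ms):
    """Subtract age_ms milliseconds from the received time, so a later poll
    for 'old' records will pick this feature up for deletion."""
    return received_time - timedelta(milliseconds=age_ms)

received = datetime(2015, 8, 7, 13, 12, 0, tzinfo=timezone.utc)
stamped = aged_timestamp(received, 10 * 60 * 1000)  # age by ten minutes
print(stamped.isoformat())  # 2015-08-07T13:02:00+00:00
```

Writing the pre-aged value into the event's Date field before the feature update is what lets the polling query treat the feature as already stale.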
08-07-2015 01:12 PM
POST
Hello Mike - I've branched your question to a new thread, as you are asking for advice on bringing data into GeoEvent rather than using a stream service to broadcast data out from GeoEvent. When working internally, behind your firewall, data you HTTP/POST to a GeoEvent hosted REST endpoint would need to be sent to the default port 6180. For example: http://server-name:6180/geoevent/rest/receiver/rest-json-in

On another thread - GeoEvent Processor and Verizon Networkfleet - Daniel mentions that your IT department might be able to create a specific tunnel for your device to reach the default port 6180 on the server running your GeoEvent instance. I also describe, on that thread, how you might use a pair of GeoEvent instances, one running in the cloud, to relay data from a provider into your organization.

One of Esri's solution engineers asked what sounds like a question similar to the one you pose. He indicated that he had a public-facing web server with a JavaScript application submitting event data to GeoEvent via a 'Receive JSON on a REST Endpoint' inbound connector. The public-facing server was actually a reverse proxy (Apache) to an ArcGIS/GeoEvent Server (IIS), communicating with GeoEvent on port 6180 via the following lines in the httpd.conf file:

ProxyPass /geoevent http://10.61.3.5:6180/geoevent
ProxyPassReverse /geoevent http://10.61.3.5:6180/geoevent

With a reverse proxy in place, the URL for the JSON request does not contain the port. So rather than sending data to http://host:6180/geoevent/rest/receiver/<name> you would send it to something more like http://proxy/geoevent/rest/receiver/<name>. The advice provided was to install the Apache module mod_proxy_wstunnel to enable the reverse proxy to handle websocket tunneling. One of the developers on the GeoEvent team evaluated the Nginx reverse proxy, so I don't have the exact syntax needed to configure the Apache reverse proxy. You would need to create a rule something like the following so that websocket connections from the client get routed to the web server listening for those connection requests:

ProxyPass /arcgis/ws ws://10.61.3.5:6180/arcgis/ws

ArcGIS Server's REST handler would need to be configured to provide the correct websocket address to clients, without the port number embedded in it. To do this you need to use the ArcGIS Server Admin Directory to add a property to the Server to register your proxy of choice. (Refer to the attached illustration...) If you are working with a machine where the HTTP ports for the GeoEvent Extension have been changed from the defaults (6180 / 6143) to something else (e.g. 8080 / 8443) ... be sure to specify the correct port when updating the ArcGIS Server properties above. Hope this information is helpful. - RJ
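Since it was the Nginx reverse proxy that was evaluated, here is a hedged sketch of what the equivalent websocket rule might look like on the Nginx side. The upstream address is carried over from the Apache example above purely for illustration; verify the directives against your own Nginx deployment before relying on them:

```nginx
# Sketch: route websocket upgrade requests for /arcgis/ws to the
# internal server, preserving the HTTP Upgrade handshake.
location /arcgis/ws {
    proxy_pass http://10.61.3.5:6180;
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
}
```

The two proxy_set_header lines matter: without forwarding the Upgrade/Connection headers, the proxied connection stays plain HTTP and the websocket handshake fails.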
08-07-2015 11:28 AM
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 01-05-2023 11:37 AM |
| | 1 | 02-20-2025 03:50 PM |
| | 1 | 08-31-2015 07:23 PM |
| | 1 | 05-01-2024 06:16 PM |
| | 1 | 01-05-2024 02:25 PM |