POST
My apologies Varun that your question has gone so long without a response. Stream services provide a smooth visualization experience. If you have not yet found an approach that works for you, I would recommend designing a GeoEvent Service which incorporates a 'Poll an ArcGIS Server for Features' input configured to poll your map service every five seconds (or so). I would configure the input to 'Get Incremental Updates' to avoid receiving all features the map service has to offer every time GeoEvent polls for data. The simplest GeoEvent Service would not include any filtering or processing. Just direct the event data received by the input to a 'Send Features to a Stream Service' output. If you locate your published stream service in your ArcGIS REST Services Directory you should see a link which will allow you to view the feature data being broadcast by the GeoEvent stream service output on a JavaScript web map. The GeoEvent tutorials contain step-by-step examples and illustrated workflows. Please take a look specifically at the Stream Services in GeoEvent tutorial. You may also find the video recording of the Building Real Time Web Applications technical workshop helpful. Hope this information helps - RJ
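If it helps to see the polling pattern concretely, here is a rough Python sketch of the kind of query a 'Poll an ArcGIS Server for Features' input issues against a map/feature service's query endpoint. The service URL and the last_edit_date field are hypothetical placeholders; GeoEvent's actual 'Get Incremental Updates' mechanism is internal to the connector, but it amounts to restricting each poll to features changed since the previous poll.

```python
import urllib.parse

# Hypothetical service URL -- substitute your own published service.
SERVICE_URL = "https://myserver:6443/arcgis/rest/services/Vehicles/MapServer/0/query"

def build_poll_url(last_poll_epoch_ms=None):
    """Build the feature query issued on each polling interval.

    With incremental updates, only features edited since the last poll
    are requested (sketched here via a hypothetical date field)."""
    params = {"outFields": "*", "f": "json", "where": "1=1"}
    if last_poll_epoch_ms is not None:
        # Incremental update: restrict to features changed since last poll.
        params["where"] = "last_edit_date > {0}".format(last_poll_epoch_ms)
    return SERVICE_URL + "?" + urllib.parse.urlencode(params)
```

A scheduler would call `build_poll_url(...)` every five seconds (the polling interval configured on the input) and hand the returned features to the GeoEvent Service.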
Posted 01-21-2016 10:09 AM
POST
Good news on this ... the Message Formatter (used by the 'Send an Email' outbound connector) was enhanced at 10.4 to support HTML code. When configuring the output, use the new drop down to select HTML, and e-mail messages you send will display using HTML code you enter into the GeoEvent output connector's message body. For example: Sending the following event data to GeoEvent (using a 'Receive JSON on a REST Endpoint' inbound connector): {
"Customer_Logo": "http://webapps-cdn.esri.com/Apps/MegaMenu/img/logo.jpg",
"Issue_Header": "Enhance Message Adapter with the ability to handle HTML content",
"Tech_Advisor": "Ming Zhao",
"Date_Reported": "June 2, 2014",
"Customer_Name": "Becks Hybrids",
"Customer_No": "123-45-6789",
"Contact_Phone": "1-800-937-2325",
"Customer_Address": "6767 E. 276th St.",
"City_State_Zip": "Downs, IL 61736"
} And entering the following HTML code into the 'Send an Email' output's 'Message Body' text box (note that each ${...} token must match a field name in the event data, so the logo reference uses ${Customer_Logo}): <img src='${Customer_Logo}' alt='logo' width='80'/>
<h1 style='font-family:Segoe, Arial, sans-serif;font-weight: 700;'>${Issue_Header}</h1>
<table style='font-family:Segoe, Arial, sans-serif;font-weight: 400;border-collapse:collapse;' cellpadding='5'>
<tr style='background-color:lightgrey;'>
<td style='font-weight:600;' width='130px'>Technical Advisor:</td>
<td width='140px'>${Tech_Advisor}</td>
<td width='50px'>
</td>
<td style='font-weight:600;' width='150px'>Date Reported:</td>
<td>${Date_Reported}</td>
</tr>
<tr>
<td colspan='5'>
<table style='width:100%; border-bottom-style:solid; border-bottom-width:3px; border-right-style:solid; border-right-width:3px; border-bottom-color:#888888; border-right-color:#888888;'>
<tr style='font-weight:700; font-size:larger; background-color:lightgrey'>
<td colspan='5'>Customer Information</td>
</tr>
<tr>
<td style='font-weight:600;' width='130px'>Customer Name:</td>
<td width='160px'>${Customer_Name}</td>
<td width='50px'>
</td>
<td style='font-weight:600;' width='150px'>Customer Number:</td>
<td width='160px'>${Customer_No}</td>
</tr>
<tr>
<td style='font-weight:600;vertical-align:text-top' width='130px'>Customer Address:</td>
<td width='160px'>${Customer_Address} <br/> ${City_State_Zip}</td>
<td width='50px'>
</td>
</tr>
<tr>
<td style='font-weight:600;vertical-align:text-top' width='130px'>
</td>
<td width='160px'>
</td>
<td width='50px'>
</td>
<td style='font-weight:600;vertical-align:bottom' width='150px'>Phone:</td>
<td style='vertical-align:bottom' width='160px'>${Contact_Phone}</td>
</tr>
</table>
</td>
</tr>
</table> This should result in a nicely formatted table being sent, via e-mail, to a specified recipient. This was a 10.4 product enhancement. - RJ
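If you want to preview how the ${field} tokens resolve before configuring the output, a quick local sketch using Python's string.Template is handy. This mimics, but is not, the Message Formatter's actual substitution engine, and only a few of the sample fields are shown:

```python
from string import Template

# Attribute values copied from the sample event data above.
event = {
    "Customer_Logo": "http://webapps-cdn.esri.com/Apps/MegaMenu/img/logo.jpg",
    "Issue_Header": "Enhance Message Adapter with the ability to handle HTML content",
    "Customer_Name": "Becks Hybrids",
}

# A trimmed-down version of the message body; ${...} tokens use the
# same syntax the GeoEvent output connector expects.
body = Template(
    "<img src='${Customer_Logo}' alt='logo' width='80'/>"
    "<h1>${Issue_Header}</h1>"
    "<p>Customer: ${Customer_Name}</p>"
)

html = body.safe_substitute(event)  # tokens replaced with event values
```

Rendering the resulting `html` string in a browser gives a reasonable preview of what the e-mail recipient will see.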
Posted 11-17-2015 03:19 PM
BLOG
What is GeoEvent doing when it receives event data?

When an inbound connector (input) receives an event payload, an adapter parses the event data and constructs a GeoEvent. It uses a GeoEvent Definition (which specifies the data structure of each event) to do this. An input passes the GeoEvents it constructs to a GeoEvent Service for processing. GeoEvents are passed from an input to a GeoEvent Service using a message bus. The message queuing implementation is internal to the product and abstracted from the user (the 10.3 / 10.3.1 releases use RabbitMQ). The GeoEvent Service applies event filtering and processing to the events it receives, then delivers the processed GeoEvents to an outbound connector (output) for broadcast.

In order to add/update features in a geodatabase, a GeoEvent outbound connector leverages REST endpoints exposed by a published feature service. The features are actually stored in an underlying feature class in a geodatabase, so adding and updating features involves HTTP calls and backend database transactions. When the GeoEvent extension was first released, event data had to be persisted in a geodatabase and exposed through a published feature service before a client could visualize the data on, say, a web map. At the 10.3 release of the product we introduced the concept of a Stream Service ... but more on that in a little bit.

Event Throughput / System Performance

When considering system scalability and the GeoEvent extension, most folks are thinking about the number of event messages they need to process each second. A simple GeoEvent Service with a single TCP/Text input sending data out to a TCP/Text output can process thousands of events per second.
But such a simple service isn't doing very much ... it isn't undertaking any computational analytics or spatial relationship comparisons (which can be CPU intensive), it isn't transacting with a feature service to enrich event data with attributes from a related feature (which involves network latency), and it isn't doing anything with GeoFences (which can use a lot of available RAM). Improvements made to the product for the 10.3 release focused on maintaining a high level of GeoEvent throughput when real-time analytics such as spatial filtering (e.g. GeoEvent INSIDE GeoFence) were incorporated into a GeoEvent Service.

Just as important as the number of events you expect to receive each second, or the filtering/processing you plan on performing, is the overall size of each event in bytes. You can easily process 1000 events per second on a single server when the events are fairly small - say just a few simple attributes with a point geometry. If your events include several polygon geometries, each with many dozens of vertices, and/or complex hierarchical attribute structures with lists/groups of values, then each event's "size" will be much larger. More system resources will be required to handle the same number of events and your events-per-second throughput will be reduced.

Most folks want to use the event data to update features in a feature class through a published feature service. Transactional throughput with a feature service is an acknowledged bottleneck. When you configure a GeoEvent output to add/update features in a feature service you can expect to be limited to just a couple hundred events per second. There are a couple of parameters you can adjust on your GeoEvent outbound connector to throttle event output as you prototype your event transactions, but you typically don't need to if you are below the 200 - 300 events per second threshold we recommend for updating features in a locally hosted feature service.
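The relationship between input rate, output rate, and backlog reduces to simple arithmetic; here is an illustrative Python sketch. The rates are hypothetical examples consistent with the 200 - 300 events per second feature service guidance above:

```python
# Back-of-envelope sketch of event backlog growth. Rates are illustrative:
# a sustained input rate vs. a slower feature service output ceiling.

def backlog_after(seconds, in_rate=1000, out_rate=240):
    """Events queued after 'seconds' of sustained input at in_rate,
    drained at out_rate (never negative)."""
    return max(0, (in_rate - out_rate) * seconds)

def seconds_to_drain(burst_size, out_rate=240):
    """Time to clear a one-off burst once the input goes quiet."""
    return burst_size / out_rate

# A sustained 1000 ev/s against a 240 ev/s output grows the backlog by
# 760 events every second; a single 1000-event burst drains in ~4.2 s.
```

The sketch makes the failure mode obvious: a one-off burst is recoverable, but any sustained input rate above the output rate grows the backlog without bound until some system resource saturates.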
Please note that event throughput is further restricted when updating a hosted feature service within an ArcGIS Online organization account.

Regarding system performance, say you were to receive a burst of 1000 events on an input. If you knew, from testing you had conducted, that your local server and database could only update 240 events per second, then you can assume that GeoEvent will need to create a backlog of events and work to process events off the backlog. The 10.3 product release does a better job of this than the 10.2.2 release. Say your data was not a single burst, but that you expected to receive a sustained 1000 events per second, and your output was still only handling 240 events per second. At 10.3 you can expect that GeoEvent performance will degrade sharply as the backlog begins to grow. GeoEvent will work to clear the backlog, but it will continue to grow until a system resource becomes saturated and the system becomes unresponsive. This is the behavior we observed during the GeoEvent 10.3 Holistic Event in DC. It could be that you run out of system RAM; it could be that you saturate the network card. If you are not processing events as fast as you are receiving them, you will have a problem.

Stream Services

Stream Services, available with the GeoEvent 10.3 release, provide an alternative to updating features in a feature service. The key when thinking about Stream Services is to separate in your mind data visualization and data persistence. We have another tutorial, Stream Services in GeoEvent, which you might want to take a look at if streaming data out for visualization without persisting the data in a geodatabase is something you are interested in pursuing. Stream Services rely on WebSockets. The JSON broadcast from a Stream Service can be received by a custom JavaScript application; the Stream Service concept is supported by a developer API.
They can also be added to web maps as feature layers to display the event data in near real-time. The reason behind the development of the Stream Service was that, without them, your only option for data visualization was to first persist your event data as features in a geodatabase feature class, then have your client application query the feature class through a published feature service. An ArcGIS Server instance running on a local GIS Server with a local RDBMS supports up to about 240 events per second. That's about a tenth of what a typical GeoEvent Service is able to process each second. Streaming the event data out as JSON was one way to provide data visualization for folks who didn't require data persistence.

WebSockets also allow a client to specify spatial and/or attribute filters which the server will honor. This allows network bandwidth to be preserved by limiting the data being passed over the socket connection. It is up to the client application to know what it can support and to specify filters which limit the amount of data the client will receive. In the case of a custom JavaScript web application, if too much information is sent from GeoEvent to a browser's web application, the browser will simply freeze and may crash. The socket connection will close when the client dies, and GeoEvent will continue broadcasting events through the Stream Service for receipt by other clients (currently connected or in the process of connecting).

System Resources / System Sizing

There's actually quite a bit you need to consider when thinking about scalability.

Physical Server

When leveraging virtualization, in my experience, it is most important that the physical server hosting the virtual machines adequately support the virtualization. The physical server is going to need considerable memory and a substantial processor if you plan on instantiating multiple VMs, each with 4 CPU cores and 16GB of RAM.
Also, the more physical servers you incorporate in your architecture, the more important the network card will become. At the 10.3 product release we support high-availability and high-velocity concepts by configuring an ArcGIS Server cluster with multiple servers and installing GeoEvent on each server in the cluster. We have a Clustering in GeoEvent tutorial which introduces these concepts.

RAM

In a Holistic lab we conducted back in September, we discovered that the type of RAM is as important as the amount. It's not sufficient to consider a server as simply having 16GB of RAM. Premium DDR4 SDRAM will provide 2133 - 4266 MT/s (million transfers per second) whereas DDR3 RAM from just a couple of years ago will provide only 800 to 2133 MT/s. You may not have any control over the type of RAM in the physical server hosting the VMs being provided to you - but it matters. If you are importing GeoFences into the GeoEvent processor, know that these geometries are all maintained in memory. If you have thousands of complex GeoFences with many vertices, that is going to consume a lot of RAM. Significant improvements were made at the 10.3 product release to allow GeoEvent to better handle the spatial processing needed to determine the spatial relationship between an event's geometry and a GeoFence, so event throughput is much better - but a high volume of spatial processing can consume significant CPU.

CPUs

The number of CPU cores is generally important once you begin designing GeoEvent Services with actual processing - it is not as important when benchmarking raw event throughput. For example, a benchmark taking event data from a TCP socket and sending the data to a TCP socket doesn't require much CPU; a large amount of premium RAM is more important in this case. Projecting an event's geometry, enriching events, calculating values - these analytics will all place load on your CPU resource.
I wouldn't be surprised to learn that a physical server with only 4 cores and 32GB of premium RAM outperformed a virtual cluster of three VMs, each with 4 cores and 16GB of RAM. The hosting server might be an older machine with DDR3 or DDR2 generation RAM. The hosting server might be supporting other virtualization. Network connections between physical machines might benefit from an upgrade.

Given the above, you can probably understand why we recommend that you dedicate an ArcGIS Server site to real-time event processing. You might have heard this referred to as a "silo'd" approach to your system architecture, in which one ArcGIS Server site is set up to handle your map service, feature service, geoprocessing, and map tile caching, with a second ArcGIS Server site set up for real-time event processing and stream service publication.

There are many factors you will need to consider when making system architecture and hardware purchasing decisions. Videos from technical workshops presented at our Developer Summit in Palm Springs as well as the International User Conference in San Diego are available online at videos.esri.com ... search for "geoevent best practices" to find a video which presents some of these considerations. The above is, of course, specific to the ArcGIS GeoEvent Extension for Server. A much more comprehensive look at system architecture design strategies is provided by Dave Peters in his blog: System Architecture Design Strategies training reference material. Hope this information is helpful - RJ
Posted 11-11-2015 11:12 AM
POST
In the following illustration, notice that the option to Add layer to new map with editing enabled is not being presented. ArcGIS Server / Portal / ArcGIS Online administrators should be clear on the following points:

- Users with access to the ArcGIS Server site can always navigate to the feature service's page in the ArcGIS REST Services Directory and use the Add Features, Update Features, and Delete Features endpoints to make edits administratively using the REST Services API.
- Only services listed when a user clicks 'My Content' in the ArcGIS Online for Organizations portal will be listed by GeoEvent when configuring an add/update feature output, clicking to import a GeoEvent Definition, etc. Items which have been shared with you are not listed by the GeoEvent Extension, by design.
- Users with access to the GeoEvent Manager can configure an add/update features output by using the Default GeoEvent Data Store connection to access a locally published feature service.
- Users who do not want to maintain a locally published map service and feature service, but want to use a hosted feature service instead, must enable editing so that GeoEvent can discover the REST endpoints needed to add/update features ... otherwise GeoEvent cannot be used to add or update features in the hosted feature service.

GeoEvent, by design, does not allow users to work around the fact that editing has been disabled on a hosted feature service by accessing REST endpoints which are not exposed by the service. The endpoints exist, but GeoEvent does not have a workflow comparable to Add layer to new map with editing enabled (refer to the illustration below). The GeoEvent Manager will display a validation error when an add/update feature output targets a local service whose Feature Access capability has not been toggled "on". The same validation error is displayed when an add/update feature output targets a hosted service whose Editing property is Disabled.
The validation error reads: Status: Error - Validation Error: Transport not started because Create capability is required by the output but is missing from the layer in the feature service. Named users who have published an AGOL hosted feature layer do not need to specifically enable editing ... AGOL allows them to Add layer to new map with editing enabled. Hope this information was helpful - RJ
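Conceptually, the validation GeoEvent Manager performs amounts to checking the capabilities a feature layer advertises in its JSON description. A minimal, hypothetical sketch (the capability strings are illustrative of the comma-separated 'capabilities' property a feature service reports):

```python
def editing_enabled(capabilities):
    """Return True when the layer advertises the capabilities an
    add/update features output needs ('Create' and 'Update')."""
    caps = {c.strip() for c in capabilities.split(",")}
    return "Create" in caps and "Update" in caps

# A service with Feature Access / editing enabled typically reports
# something like "Create,Query,Update,Delete"; a non-editable one
# might report only "Query".
```

When 'Create' is missing from that list, the output reports the "Create capability is required by the output but is missing from the layer" validation error quoted above.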
Posted 11-03-2015 05:10 PM
POST
Matt - Thanks for your feedback and idea. The GeoEvent Extension deliberately, by design, does not allow add/update feature operations on feature layers when editing has not been explicitly enabled. For locally published services you must toggle the Feature Access capability on. For services you publish as hosted by ArcGIS Online you must specifically enable editing by selecting Enable editing and specifying that editors are allowed to Add, update, and delete features.

If your goal is to restrict organization members and prevent them from editing features, it is recommended that you publish a feature service to your local ArcGIS for Server site (with the Feature Access capability enabled) and then add an item which references the locally published map service to your ArcGIS Online for Organizations portal. Refer to the steps outlined below:

- From the My Content page, select Add Item > From the web and enter the URL of your locally published map service (e.g. https://my-server:6443/arcgis/rest/services/service-name/MapServer/0).
- Expose the non-feature-editable AGOL feature layer to others in your organization by sharing the item. Named users will be able to discover the feature layer and include the layer in a web map, but they will not be able to add/update features using a web map they open through the ArcGIS Online for Organizations portal.

If your goal is to allow members of your organization to edit features using a web map, but not necessarily share the feature service from your local ArcGIS for Server site publicly, it is recommended that you add an item which references the locally published feature service to your ArcGIS Online for Organizations portal and then share the feature service only with members you specifically invite to join a group.

- From the My Content page, select Add Item > From the web and enter the URL of your locally published feature service (e.g. https://my-server:6443/arcgis/rest/services/service-name/FeatureServer/0) to add a reference to the feature service to your organization's content.
- Do not expose this second item by sharing it with "Everyone" - share it with specific members through an established group. Group members who select to add this second item to a web map will have the 'Edit' button available and will be able to add/edit/update features using the web map.
Posted 11-03-2015 05:08 PM
POST
This discussion is being re-posted from a user idea submitted to the ArcGIS Ideas portal. http://ideas.arcgis.com/ideaView?id=087E00000004sAHIAY When using ArcGIS Online, it is possible to edit feature services that are non-editable if you are logged in as an administrator. GeoEvent Processor cannot add or send updates to a non-editable feature service, even when the arcgis.com data store has been set up using an administrator account. There are likely to be use cases where GEP users wish to share feature services with the public but lock down editing to only administrators. If a feature service is made editable, both administrators and publishers are able to edit it - this is a concern in regards to accidental edits. This 'idea' proposes that GEP be able to update non-editable feature services, when the service is accessed through an AGOL administrator data store. Posted by mtjones to ArcGIS Server Sep 30, 2013
Posted 11-03-2015 05:07 PM
POST
Hello Caroline - Looking closely at the error you posted, you are still being told "Unable to perform query". Your underlying issue has not been resolved. GeoEvent is still looking for an OBJECTID for the specified TRACKID values: where=animal_sighted IN ('Cheetah','Leopard','Elephant','Buffalo','Wildebeest') Notice the REST endpoint on which the request is being made: .../FeatureServer/0/query Also, notice the fields being requested: outfields=animal_sighted,objectid Verifying event attributes in your input CSV and using a Field Mapper processor to make sure GeoEvents sent to an 'Update a Feature' output have the correct schema are good workflow debugging steps. Using the GeoEvent Monitor page to observe event counts for your input, GeoEvent Service, and output is also good. However, jumping from this to Operations Dashboard to see if features are being updated in an operational view is premature. As part of your debugging, have you stepped through the narrative on pages 34 - 42 ("Reconfigure the Flights GeoEvent Service") and ("Examine an updated feature service’s REST endpoint") in Module 2 of the product introduction tutorial? I would recommend using a TCP/Text output (and possibly a File/JSON output as well) to see what your GeoEvent Service is producing. You also need to be using the ArcGIS REST Services Directory to examine your feature service's content before expecting Operations Dashboard to show you features being updated. Have you had a chance to try some of the debugging steps outlined in the Debugging the Add a Feature / Update a Feature Output Connectors blog? Understanding the queries and Server responses illustrated in the FeatureServiceUpdate.png attached to that blog -- and observing the same in your GeoEvent's karaf.log -- are important when trying to figure out why features are not being added or updated in a feature service's feature class. 
Unfortunately, seeing an output's count increment on the GeoEvent Monitor page is not sufficient to conclude that requests made to the feature service, or transactions made with the database, were successful. Until we figure out why GeoEvent is unable to query the feature service to obtain an OBJECTID, you will never see requests made to update features (.../FeatureServer/0/updateFeatures). Until you see updateFeatures messages in the GeoEvent log, you can't conclude that GeoEvent has actually requested that the feature service update features in its feature class. The web map you have prepared, which the operational view is using, won't be able to tell you anything if features are not being updated in the underlying feature class. Hope this helps - RJ
Posted 09-28-2015 01:44 PM
POST
Hello Caroline - Here's how you parse this error message: Error Posting to URL:
http://localhost:6080/arcgis/rest/services/Caroline/Schema1/FeatureServer/0/query
with parameters:
where=animal_sighted IN ('Wildebeest','Zebra','Thompson_Gazelle','Giraffe','Lion')
&outfields=animal_sighted,objectid&f=json.
java.io.IOException: {"error":{"code":400,"message":"Unable to complete operation.",
"details":["Unable to perform query operation."]}} at
com.esri.ges.transport.featureService.FeatureServiceOutboundTransport.validateResponse(FeatureServiceOutboundTransport.java:745)

Line 2: You are making a query on the locally hosted Schema1 feature service (found in the Caroline services folder).
Line 4: GeoEvent is looking for an OBJECTID (feature row identifier) for the listed TRACKID values ('Wildebeest', 'Zebra', etc.).
Line 5: The query should return only the fields animal_sighted and objectid, in a JSON structure.
Line 6: ArcGIS Server couldn't complete the query ... this is a generic error return code and message from Server.
Line 8: Identifies the component, method, and line of code in the GeoEvent implementation which generated the exception.

Since you cannot attach a debugger and look into the GeoEvent source code, your only option is to set DEBUG logging on the reported component and see if you can get some additional information. In GeoEvent Manager, click Logs > Settings and enter com.esri.ges.transport.featureService.FeatureServiceOutboundTransport as the component for which you want the Logger to produce DEBUG messages. Save the logging settings and send / simulate another event so that your GeoEvent Service will try again to add/update features. (I'm assuming that is what you are trying to do, based on the error message you posted.)

Check the karaf.log to see if you were able to capture any additional information. I find that opening the logfile using a text editor can be more useful than looking at log messages in the GeoEvent Manager's Logs viewer. You can find the karaf.log file here: C:\Program Files\ArcGIS\Server\GeoEvent\data\log

Check your ArcGIS Server logs. There may be some additional detail on why Server was not able to honor the request GeoEvent was making on the feature service's .../query endpoint.
Try rebuilding the query yourself in the ArcGIS REST Services Directory:

- Browse to http://localhost:6080/arcgis/rest/services/Caroline/Schema1/FeatureServer/0/query
- Enter animal_sighted IN ('Wildebeest','Zebra','Thompson_Gazelle','Giraffe','Lion') for the WHERE clause
- Specify animal_sighted,objectid for the Out Fields parameter
- Click 'Query (GET)' to execute the query

You may need to play with your query some, like changing the WHERE clause to 1=1 to query for all records and/or changing the Out Fields to a * (so that all fields are retrieved). Pay particular attention to the URL generated based on the query you construct using the HTML page. You can copy/paste the URL from the ArcGIS REST Services Directory page into a text editor and look through the parameters. I know, for example, that there was a rather obscure defect in the 10.3.1 release which would only manifest when configuring an ArcGIS Server site with Portal and using the ArcGIS Data Store (a specific configuration of a PostgreSQL database). The outFields parameter was being constructed as outfields (little 'f') ... which was not compliant with the ArcGIS REST specification, so queries GeoEvent was making to try and obtain OBJECTID values were failing. (This issue should be fixed in the upcoming 10.4 release...) Hope this information helps - RJ
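You can also reconstruct the failing request programmatically. A small Python sketch that builds the same URL with a correctly cased outFields parameter (the base URL is copied from the error message; no request is actually sent here):

```python
import urllib.parse

# Service endpoint copied from the error message in the post above.
base = "http://localhost:6080/arcgis/rest/services/Caroline/Schema1/FeatureServer/0/query"

params = {
    "where": "animal_sighted IN ('Wildebeest','Zebra','Thompson_Gazelle','Giraffe','Lion')",
    "outFields": "animal_sighted,objectid",  # note capital 'F', per the ArcGIS REST API
    "f": "json",
}

# urlencode percent-encodes the WHERE clause for us.
url = base + "?" + urllib.parse.urlencode(params)
```

Pasting the resulting `url` into a browser (or issuing it with any HTTP client) lets you compare Server's response with what GeoEvent logs at DEBUG level.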
Posted 09-23-2015 04:03 PM
POST
Hello Richardson - If you have two fields in your GeoEvent that you want to offset from UTC to Localtime, you will need to use two Field Calculator processors. For example, consider the following input (generic JSON which can be sent to GeoEvent via HTTP / POST): [
{
"SensorID": "BZQT-5480-A",
"SensorValue": 53.2,
"ReportedDT": "2015-09-23 14:06:22.6 UTC",
"CalibrationDate": "2015-07-01 00:00:00.0 UTC"
}
] I configured a 'Receive JSON on a REST Endpoint' input with an 'Expected Date Format' property value: yyyy-MM-dd HH:mm:ss.S z The input is now configured to handle the data provider's specific string representation of a date/time, expecting the string value to include the time zone specification (in this case, UTC). If I wanted a client to display date/times as localtime, and the client wasn't configured (or able) to offset the UTC values for me, I would want to create two additional event fields - one to hold the "ReportedDT" in localtime and one to hold the "CalibrationDate" in localtime. I suggest this because deliberately falsifying the actual UTC date/time values by overwriting them with computed offsets is bad practice. So, I copy the GeoEvent Definition used by the input to create a new event definition, and add the needed fields to the new event definition. Then I use a Field Mapper to map the received data into the new schema, leaving the two localtime fields unmapped. I now have a GeoEvent with two empty fields to which I can write calculated values (and I don't have to deal with the Field Calculator dynamically creating managed GeoEvent Definitions for me). Here's my resulting GeoEvent Service, showing the configuration of each Field Calculator. In this case, you indicated that you wanted to shift the UTC values forward three (3) hours to Kuwait localtime, so I add the three-hour equivalent number of milliseconds (3 hr x 60 min/hr x 60 sec/min x 1000 ms/sec = 10,800,000 ms) to each original date/time value, instructing the Field Calculators to write their values into the prepared existing fields. The output, in Esri Feature JSON format, would look something like this: [
{
"attributes": {
"SensorID": "BZQT-5480-A",
"SensorValue": 53.2,
"ReportedDT": 1443017182006,
"CalibrationDate": 1435708800000,
"LocalTimeReported": 1443027982006,
"LocalTimeCalibrated": 1435719600000
}
}
] Notice that we've preserved the "ReportedDT" and "CalibrationDate" values reported by the sensor. Combining the Field Mapper with the Field Calculators, we've effectively enhanced the sensor data to include reported and calibrated date/time values, offset to Kuwait / Riyadh localtime. You can use online utilities such as EpochConverter to convert the epoch millisecond values to a human-readable date/time. Hope this information helps - RJ
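As a sanity check on the millisecond arithmetic, a short Python sketch (the epoch values are copied from the sample JSON above; this mimics the Field Calculator expression rather than reproducing it):

```python
from datetime import datetime, timezone

# Three hours (Kuwait, UTC+3) expressed in milliseconds:
# 3 hr x 60 min/hr x 60 sec/min x 1000 ms/sec = 10,800,000 ms
OFFSET_MS = 3 * 60 * 60 * 1000

def to_local_epoch_ms(utc_epoch_ms):
    """Mimic the Field Calculator: original UTC value + 10,800,000 ms."""
    return utc_epoch_ms + OFFSET_MS

reported_local = to_local_epoch_ms(1443017182006)

# Epoch milliseconds can be cross-checked against a human-readable UTC
# timestamp (the sample's ReportedDT is 2015-09-23 14:06:22.6 UTC).
readable = datetime.fromtimestamp(1443017182006 / 1000.0, tz=timezone.utc)
```

Running this reproduces the "LocalTimeReported" and "LocalTimeCalibrated" values shown in the Esri Feature JSON output.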
Posted 09-23-2015 03:11 PM
POST
Hello Yuying - Thank you for sending me a sample of the JSON you are sending to GeoEvent so that I can test using your event data. [
{
"UserID": "Dave",
"DateTime": "08/13/2015 11:53:30 AM",
"Latitude": 34.05716,
"Longitude": -117.195704,
"WKID": 4326
},
{
"UserID": "Dave",
"DateTime": "08/13/2015 11:54:00 AM",
"Latitude": 34.05897,
"Longitude": -117.195423,
"WKID": 4326
}
] I took a look at the issue you reported, but was not able to reproduce it. The Create Buffer processor is working for me when incorporated into a GeoEvent Service like the one you illustrate. There was an issue at the 10.3.0 release in which an invalid unit of measure, "feet", was included in the 'Allowed units list' used by this processor. The correct unit of measure should be "foot". Unfortunately, the invalid value was selected as the default. To correct the error you would have to select something other than the default unit. To really fix the issue you should first delete the processor which references the invalid unit of measure, then go to Site > Settings, edit the 'Allowed units list' and uncheck 'Feet' ... check 'Foot', then return to configure a new Buffer Creator processor with the valid linear unit. I'm not sure, however, that this is the issue you are encountering. The Buffer Creator in your screenshot looks like it's configured to create a 100 meter buffer. This led me to question why you were projecting the event data to the NAD 1983 State Plane system for California (WKID 2230), whose linear measure is US feet, if you intended on creating a buffer measured in meters. I've attached the 10.3.0 configuration I used to test the issue. - RJ
Posted 09-23-2015 01:45 PM
POST
Hello Matthew - The GeoEvent Extension does not have a Collector for ArcGIS input connector per se. The expected workflow would be to use the Collector for ArcGIS app to add and update features in a geodatabase feature class, then configure a GeoEvent 'Poll an ArcGIS Server for Features' input to poll the feature class and bring the features into GeoEvent. The feature service is the interface between GeoEvent and the features you are editing using Collector. One downside to using Collector for ArcGIS to support device tracking is that the app needs to be running in the foreground; otherwise the application does not broadcast the "breadcrumb" locations you want to use as track points. I believe this is still Collector's behavior ... I'm not aware that the application has been enhanced to allow location tracking while running in the background. There are alternatives you might want to consider. For example, when developing the tutorial for the NMEA message adapter, I needed a widely available, simple application which would broadcast a device's location using NMEA sentence structures. I found the GPS 2 IP application by CapsicumDreams in the Apple App Store. It happened to do what I needed, supporting NMEA messages. There are certainly other applications you could install which would enable you to broadcast your iPad's current location. Hope this information helps - RJ
Posted 09-16-2015 02:30 PM
Hello Igor - Please take a look at the following thread: Re: GeoTagging several fields from one geo fence

GeoFences are intentionally lightweight by design. Typically, geometries from features in a feature service are used to model GeoFences, and when you import (or synchronize) GeoFences the only attributes brought in are a unique name for the GeoFence and a Geometry. You can see this if you log in to the GeoEvent Administrative API and look at the GeoFences you have loaded. GeoFences are maintained in-memory, so one concern with including attribute information with every GeoFence is that system RAM requirements would increase significantly.

You should think of GeoTagging as a special case of event enrichment combined with spatial filtering. GeoTagging allows an event to be enriched with the name of a GeoFence (and optionally the name of the Category to which the GeoFence belongs) when the event has a spatial relationship with the GeoFence. The GeoFence name is the only attribute available to the GeoTagger.

Following the recommendation in the thread I mention above, if you follow a GeoTagger with a Field Enricher you can have your GeoEvent Service go back to the feature class from which your GeoFences were loaded and, using the GeoFence name the GeoTagger appended to the event as a field to join the event and the feature class, enrich the event with attribute information from the feature originally used to model the GeoFence. I tend to refer to this pattern as a "primary enrichment" (performed by the GeoTagger) followed by a "secondary enrichment" (performed by a Field Enricher).

This approach does have a limitation: GeoFences cannot overlap. When they do, the GeoTagger will enrich an event with a comma separated list of GeoFence names, and since you cannot use a comma separated list as a primary key to perform a table join, you cannot conduct the "secondary enrichment". (See threads Field Enricher with Multiple Matches and Overlapping geofences with different attributes.)

We are considering an enhancement to allow users to specify, when importing or synchronizing GeoFences, that attributes other than a unique name and Geometry be included in GeoEvent's catalog of GeoFences, but the enhancement has not been assigned and will likely not be considered until after the 10.4.1 product release. Hope this information helps - RJ
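Conceptually, the two-step enrichment behaves like a dictionary join keyed on the geofence name. A toy sketch, with invented geofence names and an invented attribute table standing in for the feature class the geofences came from:

```python
# Toy model of "primary" geotag followed by "secondary" field enrichment.
geofence_attributes = {
    "Sector-A": {"manager": "Smith", "phone": "555-0101"},
    "Sector-B": {"manager": "Jones", "phone": "555-0102"},
}

def geotag(event, geofence_name):
    """Primary enrichment: the GeoTagger appends only the geofence name."""
    return dict(event, geofence=geofence_name)

def enrich(event, table):
    """Secondary enrichment: join on the geofence name to pull attributes."""
    key = event["geofence"]
    if "," in key:
        # Overlapping geofences yield "Sector-A,Sector-B" -- not a usable join key.
        raise ValueError("cannot join on a comma separated list of names")
    return dict(event, **table[key])

evt = geotag({"track_id": 42}, "Sector-A")
print(enrich(evt, geofence_attributes))
```

The ValueError branch mirrors the overlap limitation described above: once the GeoTagger emits a list of names rather than a single name, the Field Enricher has no single key to join on.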
Posted 09-15-2015 06:25 PM
Hello Sharon - What you are trying to do is definitely possible. I'm limited in my options for sending data to GeoEvent via TCP ... basically I have to use the GeoEvent Simulator ... but I can walk through what I did with you.

The GeoEvent Simulator normally sends delimited text, specifically comma separated values. But that's not some hard-coded property or behavior. We could use it to send XML - we would just have to configure a GeoEvent input to expect the data to arrive via TCP and couple the TCP transport with the XML adapter so the input would know how to interpret the data.

I've attached an input file (FeatureFlightsXML.csv) you can use as an example. Ignore the *.csv extension ... it's only there because the GeoEvent Simulator is looking for files named *.csv. The file contains XML with each event on a separate line. This is necessary because the Simulator sends data one line of text at a time, so we have to account for that.

<Flight><FlightNumber>SWA2706</FlightNumber><StartTime>3/16/2012 02:25:30 PM</StartTime><OriginAirportCode>IAD</OriginAirportCode><DestinationAirportCode>TPA</DestinationAirportCode><AircraftType>B733</AircraftType><Altitude>37000</Altitude><Longitude>-79.585739</Longitude><Latitude>34.265521</Latitude></Flight>
<Flight><FlightNumber>SWA724</FlightNumber><StartTime>3/16/2012 02:25:30 PM</StartTime><OriginAirportCode>IAD</OriginAirportCode><DestinationAirportCode>ABE</DestinationAirportCode><AircraftType>SF34</AircraftType><Altitude>10000</Altitude><Longitude>-76.405289</Longitude><Latitude>39.573271</Latitude></Flight>
<Flight><FlightNumber>SWA992</FlightNumber><StartTime>3/16/2012 02:25:30 PM</StartTime><OriginAirportCode>IAD</OriginAirportCode><DestinationAirportCode>MDW</DestinationAirportCode><AircraftType>B737</AircraftType><Altitude>36400</Altitude><Longitude>-82.067414</Longitude><Latitude>39.630957</Latitude></Flight>
<Flight><FlightNumber>SWA2358</FlightNumber><StartTime>3/16/2012 02:25:30 PM</StartTime><OriginAirportCode>TPA</OriginAirportCode><DestinationAirportCode>IAD</DestinationAirportCode><AircraftType>B733</AircraftType><Altitude>37000</Altitude><Longitude>-81.20226</Longitude><Latitude>32.099263</Latitude></Flight>

I've also attached a GeoEvent configuration file (Xml-to-JSON-Config.xml) which has a custom inbound connector I configured, pairing the XML adapter with the TCP transport. The configuration includes three inputs: one expects delimited text via TCP, one expects to receive XML via a REST endpoint, and the third is my custom input which expects XML via TCP. The output included in the configuration expects to be able to write generic JSON to a file it can create in the C:\GeoEvent\output folder on your local server. The GeoEvent Service XML-to-JSON feeds data from each input to the file-json-out output. It's the job of each input's adapter to create GeoEvents from the data it receives. The output will deconstruct each GeoEvent to create generic JSON and will write that JSON to the system file.

The tcp-text-in input is what I used to verify that I can use the GeoEvent Simulator to send XML. I created a GeoEvent Definition with a single field of type String. When I send an event from the Simulator to TCP port 5565, the tcp-text-in input interprets the received data using the Text adapter and brings the data in as one long String.

The rest-xml-in input is what I used to verify that I was sending valid XML. I configured this input to create a GeoEvent Definition for me. When I select any one line of data from the simulation file and HTTP / POST it to the rest-xml-in input's endpoint, I can verify that the received data can be interpreted using the XML adapter by seeing a GeoEvent get created and written out to the system file.

The custom-xml-via-tcp input is what you're looking for. I configured this input to use TCP port 5566 (rather than the default 5565). So all I have to do is tell the GeoEvent Simulator to disconnect from the default port and reconnect on port 5566 ... then I can send the same event data one line at a time to the custom input, which will interpret the received data using the XML adapter, create GeoEvents, and write the data out to the system file. Hope this helps - RJ
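To see what the XML adapter has to work with, one of the flight records above can be parsed with a few lines of Python. This is only a sketch of the parsing a consumer would do; in GeoEvent itself the XML adapter is configured, not coded:

```python
import xml.etree.ElementTree as ET

def parse_flight(line):
    """Parse one newline-delimited <Flight> record into a plain dict."""
    root = ET.fromstring(line)
    return {child.tag: child.text for child in root}

# One record from the FeatureFlightsXML.csv simulation file above:
line = ("<Flight><FlightNumber>SWA2706</FlightNumber>"
        "<StartTime>3/16/2012 02:25:30 PM</StartTime>"
        "<OriginAirportCode>IAD</OriginAirportCode>"
        "<DestinationAirportCode>TPA</DestinationAirportCode>"
        "<AircraftType>B733</AircraftType><Altitude>37000</Altitude>"
        "<Longitude>-79.585739</Longitude><Latitude>34.265521</Latitude></Flight>")

record = parse_flight(line)
print(record["FlightNumber"], record["Latitude"], record["Longitude"])
```

Sending these records over TCP is just a matter of writing each line, newline-terminated, to the configured port (5566 in the configuration above) - which is exactly what the GeoEvent Simulator does for you.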
Posted 09-15-2015 05:19 PM
Hello Adriana - I was assuming that you were using the Esri/geoevent-datastore-proxy. Is this true, or are you using the Esri/resource-proxy? The former was developed specifically for GeoEvent users who are unable to adopt the latest release of GeoEvent. Specifically, it provides token management for ArcGIS Server connections registered with GeoEvent as Data Stores. Users who must remain at the 10.2.2 or 10.3.0 release but need functionality built in to the GeoEvent 10.3.1 release can find what they need by using the GeoEvent Data Store proxy.

If you're using the .NET version of the resource-proxy to enhance a web application, I don't think we'll be able to help; this forum is specific to the use of the ArcGIS GeoEvent Extension for Server. If you're using the GeoEvent Data Store proxy, but are not using GeoEvent, that would be an unsupported configuration. Either way, the GeoEvent product team does not have the experience you are looking for.

I think you are correct that this may be a security, proxy, or JavaScript API related problem. You might have better luck if you take your question to another forum, perhaps: ArcGIS API for JavaScript, Web AppBuilder for ArcGIS, or Web AppBuilder Custom Widgets. Good luck - RJ
Posted 09-15-2015 03:40 PM
Hello Gal - http://www.ims.gov.il/ims/PublicXML/observ.xml

I'm not seeing any problem with the XML returned from the above URL. GeoEvent's XML adapter is able to parse it just fine. I've attached illustrations of my 'Receive XML on a REST Endpoint' input configuration and the generated GeoEvent Definition. I chose to have the adapter look specifically for <surface_station> nodes, rather than receiving the <israel_surface_data> structure as a single event. I used a Chrome Poster plug-in to HTTP / POST the attached XML to the GeoEvent input's REST endpoint.

What I did notice was that, when using a 'Poll an External Website for XML' input, the HTTP inbound transport logged errors to the effect that requests to http://www.ims.gov.il/ims/PublicXML/observ.xml returned an HTTP 406 'Not Acceptable' error:

2015-09-15 11:50:58,156 | ERROR | gov.il/ims/PublicXML/observ.xml] | HttpInboundTransport | nsport.http.HttpInboundTransport 456 | 272 - com.esri.ges.framework.transport.http-transport - 10.3.1 | http://www.ims.gov.il/ims/PublicXML/observ.xml: Request failed (HTTP/1.1 406 Not Acceptable).

I'll ask to see if one of the developers can tell me why the requests to your server are being returned as 'Not Acceptable', but it doesn't appear to be an issue with the XML formatting or content. - RJ
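For what it's worth, an HTTP 406 usually means the server rejected something in the request headers, typically the Accept or User-Agent header. A quick way to experiment outside GeoEvent is to build the request with explicit headers; this is only a sketch, and whether these particular header values satisfy that server is untested:

```python
import urllib.request

def build_xml_request(url):
    """Build a GET request that explicitly advertises it accepts XML."""
    return urllib.request.Request(
        url,
        headers={
            "Accept": "application/xml, text/xml;q=0.9, */*;q=0.1",
            "User-Agent": "Mozilla/5.0 (compatible; xml-fetch-test)",
        },
    )

req = build_xml_request("http://www.ims.gov.il/ims/PublicXML/observ.xml")
# urllib.request.urlopen(req) would perform the actual fetch; it is omitted
# here so the snippet stays runnable without network access.
print(req.get_header("Accept"))
```

If the request succeeds with an explicit Accept header but fails without one, that would point at server-side content negotiation rather than anything in GeoEvent.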
Posted 09-15-2015 12:27 PM