POST
Hello @JessicaRouns. What you describe, I think, is a limitation of the spatiotemporal data store's map service implementation. You are correct that you cannot use GeoEvent Server to edit or specify feature record symbology. Because the GeoEvent Manager web application can be used to publish map and feature services, whether using a spatiotemporal data store or a traditional relational geodatabase, it is reasonable to see this as something the GeoEvent Server product team could address. But I would suggest that GeoEvent Server's role in this scenario is limited to writing data from processed event records out to feature records, using the web service as an interface to the geodatabase. I will bring this up with developers familiar with the spatiotemporal data store's map service implementation. However, since configuring symbology for feature record display is part of web service publication and specification, not GeoEvent Server, I would suggest you ask this question on the ArcGIS Enterprise Questions board or perhaps submit it as an enhancement request to ArcGIS Enterprise Ideas. You can mention @jill_es, one of the product managers for ArcGIS Enterprise, and she can help direct the conversation to the right team. Thanks -- RJ
07-03-2023 11:09 AM

POST
Hello @JessicaRouns. No, there are no plans to develop an out-of-the-box inbound connector for GeoEvent Server that reads the Esri Shapefile format. Using a shapefile to capture and relay sensor observations to GeoEvent Server is not considered a good fit for real-time data. Date values in a shapefile are stored as a character string yyyy-mm-dd without a time. When using a shapefile, the time portion of a datetime value is truncated / dropped from the data value. For example, the datetime 2002-08-20 12:00:00 PM is stored in a shapefile as simply 2002-08-20. This is a severe limitation when considering the collection and representation of sensor data presumably being updated in real-time. I would encourage you to consider ways that you might capture sensor observations as either delimited text or JSON formatted data records. You could relay delimited text to a GeoEvent Server input via a TCP socket relatively easily, or use an input capable of receiving JSON as HTTP/POST requests. Either way, you are probably going to want to use a web service to convey the data, or write a Python script to send the data to GeoEvent Server's input. You could use a system file to convey the data as either delimited text or in a JSON format, but I would recommend other transport protocols over using system files. Hope this information helps -- RJ

Cross-reference:
- Calculate date fields (ArcGIS Pro Documentation)
- Fundamentals of date fields (ArcMap 10.8 Documentation)
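As a rough sketch of the JSON suggestion above, a short Python script could relay one sensor observation to a GeoEvent Server input as an HTTP/POST request. The endpoint URL and the field names below are hypothetical placeholders; substitute the receiver URL shown for your own "Receive JSON on a REST Endpoint" input and the fields from your own GeoEvent Definition.

```python
# Minimal sketch: POST a JSON-formatted event record to a GeoEvent Server
# input. The URL and field names are hypothetical examples only.
import json
import urllib.request

def send_observation(url, record):
    """Send one JSON event record to a GeoEvent Server input via HTTP/POST."""
    body = json.dumps(record).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Example event record; attribute names must match your GeoEvent Definition.
record = {
    "track_id": "sensor-01",
    "reported_dt": "2023-07-03T10:08:00-07:00",  # ISO 8601 with a TZ designator
    "x": -104.82,
    "y": 41.14,
}

# Hypothetical receiver endpoint -- replace with your input's actual URL:
# send_observation("https://server.example.com:6143/geoevent/rest/receiver/my-input", record)
```

Note that the timestamp is sent as an ISO 8601 string with a time zone designator, which avoids the truncation problem shapefiles have with date fields.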
07-03-2023 10:08 AM

POST
Previous presentations for most technical workshops are available from the https://mediaspace.esri.com site. The 2022 UC presentation for GeoEvent Server Best Practices is currently available for review. Questions and comments from either presentation are welcome on this thread.
06-30-2023 03:51 PM

POST
This is a discussion thread opened for users, distributors, and especially attendees of the 2023 Esri User Conference to post questions following our technical workshop ArcGIS GeoEvent Server: Best Practices.
John Fry and I will monitor this thread throughout the summer to follow up with answers to questions related to our technical workshop presentation. If you have a "Best Practices" type of question, please post it as a reply in this thread.
06-30-2023 03:29 PM

IDEA
Update June 2023 -- There have been a couple of changes to GeoEvent Server which could be used to support coded value replacement as part of an event processing workflow. These changes require that you have upgraded to at least the ArcGIS 10.9.1 release of GeoEvent Server. The first option, if you have only one or maybe two attribute fields which contain coded values, would be to use the new Choice element. You could use Choice to switch on the coded value in each field and fan out, following each choice with a separate Field Mapper processor to write a descriptive string or label for a coded value into an attribute field. If you allowed the input to adapt the coded values as String, then you could have the Field Mapper replace the string value '1' with the more descriptive string 'Minor Leak', for example. The drawback to this approach is that a coded value domain with a dozen (or more) discrete coded values would require a separate Field Mapper for each choice of coded value. The fan-out in your GeoEvent Service could become unwieldy, especially since you would also need a separate Choice element for each attribute field which contained a coded value. This is hardly better, in my opinion, than using a Field Enricher if you have dozens of attribute fields each with their own coded values. The approach uses a lot of brute force, and isn't very elegant, but it is more readable (perhaps) than the second approach I can suggest. The second approach would be to use a Field Mapper processor which, beginning with the 10.9.1 release, supports field name delimitation and expression evaluation.
Suppose you were receiving data like the following:

[
  { "code1": 1, "code2": 20, "code3": -3 },
  { "code1": 2, "code2": 30, "code3": -2 },
  { "code1": 3, "code2": 10, "code3": -1 }
]

The values { 1, 2, 3 } in the field code1 should be replaced with { 'Red', 'Green', 'Blue' }
The values { 10, 20, 30 } in the field code2 should be replaced with { 'Small', 'Medium', 'Large' }
The values { -1, -2, -3 } in the field code3 should be replaced with { 'Jack', 'Jill', 'Jane' }

An expression like the following could chain together a series of replaceAll( ) functions to handle the coded value replacement:

replaceAll(replaceAll(replaceAll(code1, '1', 'Red'), '2', 'Green'), '3', 'Blue')

The attribute value in the input field (code1 in this example) would have to be adapted as a String (rather than an Integer) so that the expression could overwrite one string value with another. Nesting several replaceAll( ) functions together this way requires the input string to be iteratively evaluated (and re-evaluated), which isn't very efficient. The expression itself would also become unwieldy if, for example, there were dozens of coded values in the domain. The only real advantage to this second approach is that you could configure a single Field Mapper with several different string substitution expressions, one in each of several different mapped fields. Each field mapping expression would take data from one field, translate it, and write the translation out to a target field. Configuring a Field Mapper with an array of string manipulation expressions might be better than configuring a Field Enricher to look up a descriptive string value and enrich it into an event record, if only to avoid having to configure a separate Field Enricher (with its own look-up table) for each attribute field containing a coded value you needed to translate.
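To illustrate (in Python, not GeoEvent expression syntax) why the chained replaceAll( ) approach works but scales poorly compared to a simple look-up table:

```python
# Illustrative sketch only -- Python stand-in for the GeoEvent expressions above.
event = {"code1": "1", "code2": "20", "code3": "-3"}  # coded values adapted as String

# Equivalent of:
#   replaceAll(replaceAll(replaceAll(code1, '1', 'Red'), '2', 'Green'), '3', 'Blue')
# Each nested call re-scans the string, and the chain grows with every coded value.
chained = event["code1"].replace("1", "Red").replace("2", "Green").replace("3", "Blue")

# A look-up table expresses the same coded value domains far more compactly;
# this is roughly what a Field Enricher's look-up tables would provide.
domains = {
    "code1": {"1": "Red", "2": "Green", "3": "Blue"},
    "code2": {"10": "Small", "20": "Medium", "30": "Large"},
    "code3": {"-1": "Jack", "-2": "Jill", "-3": "Jane"},
}
decoded = {field: domains[field][value] for field, value in event.items()}
```

The chained form also risks accidental substring matches when coded values overlap (e.g. '1' inside '10'), which is another reason the nested-replace expression becomes fragile as the domain grows.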
You can read more about using expressions in Field Mapper in the blog @EricIronside has here: GeoEvent 10.9: Using expressions in Field Mapper Processors
06-01-2023 11:18 AM

BLOG
@BrianLomas -- Would you please open an incident with Esri Technical Support on this so an analyst can work with you to establish reproducibility? Off the cuff, I'm thinking that the repeated key 'data' is going to be a problem. I don't know that you're going to be able to specify an XML Object Name for a Poll an External Website for XML input to use to jump forward to the correct substructure in the XML and begin reading data from that point. We'll need to take a look at your GeoEvent Definition to make sure the cardinality of the different attribute keys is configured to properly interpret <data type="list"> as a single item (cardinality 1) and the nested <data type="item"> as a collection of items (cardinality many). By chance did you review the post XML Data Structures - Characteristics and Limitations? It contains some information which complements this article's discussion, JSON Data Structures - Working with Hierarchy and Multicardinality.
05-04-2023 07:50 PM

POST
Does GeoEvent Server have a preferred string format for date and time values? When is a value for the Expected Date Format parameter required, and when can I leave it unspecified?

GeoEvent Server is able to adapt a variety of different string formats without relying on a custom string formatting pattern specified using the Expected Date Format parameter. The preferred format for date/time values is the ISO 8601 format with a time zone designator, but either of the following two string formats is preferable to other options:

"2023-02-21T14:36:45-08:00"
"Tue Feb 21 14:36:45 PST 2023"

The first formatted string is an example of the ISO 8601 standard. The second is an example of the format Java uses when converting a Date to a String. Notice that both of the examples above designate that the date and time values are in the Pacific Time Zone. GeoEvent Server is able to adapt either string to produce a Date without relying on a specific Expected Date Format pattern. The underlying epoch used for event record processing, and the long integer value written to a geodatabase as a feature record's Date, will be the same (1677019005000) regardless of the server locale or clock setting when adapting either formatted string.

GeoEvent Server inputs can adapt other string formats as Date values when a GeoEvent Definition specifies the event record attribute should be handled as a Date rather than a String. Each of the following formatted string values will also adapt successfully, without requiring an Expected Date Format pattern, but the underlying epoch will depend on the locale and clock setting of the server:

"02/21/23 02:36:45 PM"
"02/21/23 14:36:45"
"02/21/2023 02:36:45 PM"
"02/21/2023 14:36:45"

None of the four date/time strings above specify a time zone, so GeoEvent Server has to assume one when adapting the formatted strings and calculating a Date. The string values will be adapted as local date/time values.
The epoch long integer value assigned to the constructed Date will be a UTC date/time offset from the server's local time by the appropriate number of hours. This means that different servers in different time zones will compute different epoch values. Consider the example below: the epoch calculated by a server observing 'Pacific Time' will be +03:00 hours relative to the epoch calculated by a server that observes 'Eastern Time', even though both server machines presumably received and adapted the same formatted string from the same data feed. If the date/time string sent by the feed included a time zone designator, GeoEvent Server would not presume to use the server's observed time zone and the epoch computed by each server would be the same.

Timestamp values with only "hours" and "minutes"

GeoEvent Server inputs are able to adapt string timestamps from data feeds which include only "hours" and "minutes" in the formatted string value:

"02/21/23 14:36"
"02/21/2023 14:36"

The server's calculated epoch will, of course, be 45 seconds earlier than the 'Pacific Time' epoch shown in the previous example (1677018960000 rather than 1677019005000) since the reported time does not include 'seconds'.

Specifying an Expected Date Format pattern string

You only need to configure an input with an Expected Date Format pattern string if strings being received and adapted as Date values deviate from the examples above. An example frequently encountered involves date/time values on the European continent which specify the day before the month (e.g. 21/02/2023). In this case you would need to specify a date format pattern dd/MM/yyyy hh:mm:ss for GeoEvent Server to use when parsing the date string. The pattern specified uses the standard Java conventions for date/time strings. The Java SimpleDateFormat Tester is an online utility you can use to explore Java's conventions.
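A quick sketch in Python can confirm the arithmetic above: both preferred string formats resolve to the same instant, epoch 1677019005000 milliseconds, independent of the machine's locale or clock.

```python
# Sketch: both preferred formats above denote the same UTC instant.
from datetime import datetime, timezone, timedelta

# ISO 8601 string with a time zone designator.
iso = datetime.fromisoformat("2023-02-21T14:36:45-08:00")
epoch_ms = int(iso.timestamp() * 1000)  # 1677019005000

# The Java-style string carries the same instant: 14:36:45 PST == 22:36:45 UTC.
pst = timezone(timedelta(hours=-8))
java_style = datetime(2023, 2, 21, 14, 36, 45, tzinfo=pst)
same = java_style.timestamp() == iso.timestamp()  # the two epochs match
```

Because both strings designate their time zone, the computed epoch is the same on any server, which is exactly why these formats are preferred.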
Example: Receiving a UTC time which does not include a time zone designator

Suppose you want an input to adapt the formatted date/time string "February 28, 2023 21:36:45" reported by a sensor feed. The feed's specification indicates that timestamps on each record are UTC values, but looking at the data you recognize that the formatted string uses the proper name for the month rather than a numeric value, includes a comma, and does not include a time zone designator. You therefore configure your GeoEvent Server input with an Expected Date Format pattern MMM dd, yyyy HH:mm:ss to instruct GeoEvent Server how the custom string format should be interpreted. The feed's specification indicates that date/time values are reported as UTC values, so you should also use the currentOffsetUTC( ) function to add your server's current UTC offset to the adapted date/time.

Example: toDate(ReportedDT + currentOffsetUTC())

The epoch you want to compute and assign to your geodatabase feature records is 1677620205000. In an earlier example I indicated that GeoEvent Server assumes a date/time expressed in the ISO 8601 format is a UTC value when adapting the string. In this case GeoEvent Server assumes the opposite. The custom date/time format "February 28, 2023 21:36:45" follows no particular standard, so GeoEvent Server assumes that it represents a local value. The current UTC offset is added to the reported date/time in this case because the feed's specification said each record's timestamp is a UTC value and GeoEvent Server assumed the opposite. It checked the server's locale and applied an assumed offset based on the time zone observed by the server when adapting the formatted string as a Date. Time zones in North America are several hours behind Greenwich Mean Time (GMT), so GeoEvent Server would add some number of hours to push an assumed local value to a UTC standard value.
The value returned by the currentOffsetUTC( ) function is negative (in North America), so adding the value to the date/time effectively rolls back the offset applied when the data value was adapted.
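The correction can be sketched in Python. The fixed UTC-07:00 offset below is an assumption standing in for a server's locale (the actual currentOffsetUTC( ) function consults the server's time zone, Daylight Savings included):

```python
# Sketch: modeling a server observing UTC-07:00 (e.g. Mountain Standard Time).
# GeoEvent Server adapts the custom-format string as a *local* value; adding
# the (negative) current UTC offset rolls that assumed offset back, recovering
# the UTC instant the feed actually reported.
from datetime import datetime, timezone, timedelta

server_tz = timezone(timedelta(hours=-7))  # assumed server locale

# Parse with the Python analog of the pattern MMM dd, yyyy HH:mm:ss.
naive = datetime.strptime("February 28, 2023 21:36:45", "%B %d, %Y %H:%M:%S")

# Adapted as local time, the epoch lands 7 hours later than the true UTC instant.
adapted = naive.replace(tzinfo=server_tz)

# currentOffsetUTC() analog: negative in North America (-25200000 ms here).
offset_ms = int(server_tz.utcoffset(None).total_seconds() * 1000)

# toDate(ReportedDT + currentOffsetUTC()) analog:
corrected_ms = int(adapted.timestamp() * 1000) + offset_ms  # 1677620205000
```

Adding the negative offset yields exactly the epoch named above, 1677620205000, which is what you want written to the geodatabase.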
03-16-2023 03:25 PM

POST
Data from my sensor feed reports timestamp values which look like the ISO 8601 format, but Date values converted to a string display with an offset hours ahead of the local time reported in the feed. Why is this happening?

Sometimes the date/time strings sent from a data provider appear to follow ISO 8601 formatting rules, but they are missing a time zone designator or the literal 'T' between the date and time values. To answer your question we should first confirm that the raw data being received actually follows the ISO 8601 standard. For this example assume:

a) The sensor feed's specification indicates that ReportedDT is sent as a local time value
b) The ReportedDT string value "2023-02-10T11:45:00" is being received

The date/time value being sent by the data feed does not include a time zone designation. You can apply a dynamic UTC offset based on your server's observed time zone as part of a Field Calculator or Field Mapper expression to fix this:

- Allow your input to adapt the ReportedDT string value as a Date
- Use currentOffsetUTC( ) to offset the adapted Date from its local value to a UTC standard value
- Cast the result of the calculation to a Date (the arithmetic calculation returns a Long, not a Date)

Example: toDate(ReportedDT - currentOffsetUTC())

The feed's value really ought to include a time zone designator to indicate in which time zone the date/time is being reported. It would be the feed's responsibility to automatically adjust the time zone designation in data it sends when clocks change for Daylight Savings twice each year. If no offset were being applied, the time zone designation should be '-00:00' to indicate the date/time is a UTC value. The data provider's specification, in this example, supposedly says that the date and time values are consistent with a local time zone. Applications which adapt an ISO 8601 formatted string that is missing a time zone designator are free to assume a time zone.
GeoEvent Server assumes the date/time is a UTC value when adapting an ISO 8601 formatted string. The current UTC offset is therefore subtracted from the reported date/time to shift the value from what the feed specification says is a local time to the UTC standard assumed by GeoEvent Server. The value returned by the currentOffsetUTC( ) function is negative for areas in North America; subtracting a negative value effectively adds a number of hours to the Date, shifting it forward to a UTC standard time. You might think to fix the ReportedDT value by adapting it as a String and then appending a time zone designator to the string yourself. If you do, twice each year when clocks are adjusted for Daylight Savings, the time zone designator you append to the received string would require an update. So it is better to use the dynamic UTC offset provided by the currentOffsetUTC( ) function in this case. Remember, Date values in a geodatabase are maintained as epoch long integer values which by convention are assumed to be UTC values. As a best practice you want to write UTC values to your geodatabase and allow client applications to determine how they want to construct a string representation of an assumed UTC value they retrieve as part of a feature record.

Reference: https://www.w3.org/TR/NOTE-datetime
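The correction can be sketched in Python, using a fixed UTC-08:00 offset as an assumed stand-in for the server's local (Pacific) time zone:

```python
# Sketch: GeoEvent Server assumes the designator-less ISO 8601 string is UTC.
# Subtracting the (negative) current UTC offset shifts the epoch forward to
# the UTC instant the local reading actually implies.
from datetime import datetime, timezone, timedelta

server_tz = timezone(timedelta(hours=-8))  # assumed server locale (PST)

# Adapted as if it were UTC, per GeoEvent Server's ISO 8601 assumption.
adapted = datetime.fromisoformat("2023-02-10T11:45:00").replace(tzinfo=timezone.utc)

# currentOffsetUTC() analog: -28800000 ms for UTC-08:00.
offset_ms = int(server_tz.utcoffset(None).total_seconds() * 1000)

# toDate(ReportedDT - currentOffsetUTC()) analog: subtracting a negative adds hours.
corrected_ms = int(adapted.timestamp() * 1000) - offset_ms

check = datetime.fromtimestamp(corrected_ms / 1000, tz=timezone.utc)
# 11:45 local Pacific time corresponds to 19:45 UTC on the same day.
```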
03-16-2023 02:33 PM

POST
Data from my sensor feed reports the date and time of each data record in separate fields. How do I combine these into a single Date attribute?

Use a Field Calculator or Field Mapper to combine the date and time values into a string which GeoEvent Server can interpret as a Date. Your goal is to construct a single ISO 8601 formatted string that contains both the date and time values, which you can then cast from String to Date. For this example assume that ReportedDate and ReportedTime hold the values "02/16/2023" and "18:37:15" respectively. Assume that the date and time are reported as local values, so a dynamic UTC offset should be applied to avoid errors when clocks are adjusted for Daylight Savings twice each year. The string you are calculating for this example should be: 2023-02-16T18:37:15-00:00

- Allow your input to adapt ReportedDate and ReportedTime as separate String values
- Use substring( ) functions to slice values from these strings and append them to a new String
- Use a toDate( ) function to cast the constructed String value to a Date
- Use the currentOffsetUTC( ) function to offset the constructed Date from a local to a UTC standard value
- Write the result into a new attribute field whose data type is Date

Example: The expression below has been formatted for readability, but should be entered as a single line of text into a Field Calculator or Field Mapper processor. You can copy/paste the text shown below:

toDate( substring( ReportedDate, 6, 10 ) + '-' + substring( ReportedDate, 0, 2 ) + '-' + substring( ReportedDate, 3, 5 ) + 'T' + substring( ReportedTime, 0, 2 ) + ':' + substring( ReportedTime, 3, 5 ) + ':' + substring( ReportedTime, 6, 8 ) + '-00:00' ) - currentOffsetUTC()

Each substring( ) function in the example slices a few characters out of the expression's target field. For example, substring(ReportedDate, 6, 10) slices the four characters '2023' from the reported date '02/16/2023'.
The several '+' operators in the expression append substrings to one another, and the '-' characters inside the quoted literals are literal dashes. Your goal is to construct the string: 2023-02-16T18:37:15-00:00

The currentOffsetUTC( ) function uses your machine's locale to determine a millisecond offset from local time to UTC, taking current Daylight Savings adjustments into account. The example said to assume the date and time are reported as local values, so we subtract the current UTC offset to shift the constructed Date to a UTC standard consistent with the string's time zone designation (the '-00:00' we appended to the constructed ISO 8601 String). The UTC offset for regions in North America is negative, as these time zones are several hours behind Greenwich Mean Time (GMT). Subtracting a negative value from the date/time effectively adds a number of hours to the value, pushing it to a UTC standard time. Appending a time zone designation '-00:00' is a better approach than hard-coding an offset such as '-07:00' into the constructed string. Whatever constant you choose as the offset for a local time zone is likely only accurate during certain months of the year. Adjusting the date/time by applying the dynamic UTC offset takes your server's current Daylight Savings observation into account for you.

Reference: https://www.w3.org/TR/NOTE-datetime
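The slicing can be previewed in Python, whose s[a:b] slice behaves like the substring(s, a, b) calls above (start index inclusive, end index exclusive):

```python
# Sketch: building the ISO 8601 string from the two input fields, using
# Python slices as stand-ins for the GeoEvent substring() function.
reported_date = "02/16/2023"  # MM/dd/yyyy
reported_time = "18:37:15"    # HH:mm:ss

iso = (
    reported_date[6:10] + "-"   # substring(ReportedDate, 6, 10) -> '2023'
    + reported_date[0:2] + "-"  # substring(ReportedDate, 0, 2)  -> '02'
    + reported_date[3:5]        # substring(ReportedDate, 3, 5)  -> '16'
    + "T"
    + reported_time[0:2] + ":"  # substring(ReportedTime, 0, 2)  -> '18'
    + reported_time[3:5] + ":"  # substring(ReportedTime, 3, 5)  -> '37'
    + reported_time[6:8]        # substring(ReportedTime, 6, 8)  -> '15'
    + "-00:00"                  # appended time zone designation
)
```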
03-16-2023 02:14 PM

POST
Data from my sensor feed reports timestamp values in seconds rather than milliseconds. When these timestamps are converted to string values they display as dates in 1970. How do I fix this?

Some data feeds report date/time values as 10-digit epoch values measured in seconds. GeoEvent Server uses 13-digit millisecond values consistent with the ArcGIS REST API, so you have to multiply by 1000 to scale the value from seconds to milliseconds before writing it out to a geodatabase. To prevent an epoch such as 1676597835 from displaying as "Tue Jan 20 02:43:17 MST 1970":

- Allow your input to adapt the 10-digit epoch value as either a Date or a Long integer value
- Use a Field Calculator or Field Mapper to scale the ReportedDT attribute value from seconds to milliseconds
- Use a toDate( ) function to explicitly cast the arithmetic result from an implicit Long to a Date

Example: toDate(ReportedDT * 1000)
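A sketch of the scaling in Python, showing both the corrected value and the 1970 symptom:

```python
# Sketch: a 10-digit epoch in seconds must be scaled by 1000 before being
# treated as a 13-digit millisecond Date value.
from datetime import datetime, timezone

reported_dt = 1676597835  # 10-digit epoch, seconds

# toDate(ReportedDT * 1000) analog: scale to milliseconds, then interpret.
epoch_ms = reported_dt * 1000  # 1676597835000
right = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)

# The bug: interpreting the raw seconds value as milliseconds lands in
# January 1970, which is the 1970 date the question describes.
wrong = datetime.fromtimestamp(reported_dt / 1000, tz=timezone.utc)
```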
03-16-2023 01:34 PM

POST
To answer your question we should first take a look at the raw data GeoEvent Server is receiving. The date/time values associated with sensor data are highly adaptable and can be represented many different ways. Your question mentions receiving data from Verizon Connect, so let’s look at some sample data from that sort of data feed. The illustration above left shows a sample of raw JSON data typical of what Verizon Connect might send. Note the UpdateUTC data value is sent as a 13-digit epoch value in milliseconds. When a GeoEvent Server input receives data in this format and adapts it as a Date the epoch value is assumed to represent a Coordinated Universal Time (UTC) value, not a local time value such as Mountain Standard Time (MST). So to clarify a point in your question, data is not received as MST values. It is received as an epoch long integer value and adapted, first for processing as a Date data type, and later for display as a string. The second illustration, above right, shows an online date/time utility I often use to convert epoch long integer values to human-readable string representations of a given date and time. Note that the strings constructed by the utility specify an offset from Greenwich Mean Time (GMT). GMT and UTC never change for Daylight Saving Time (DST) and are sometimes used interchangeably even though GMT is technically a time zone and UTC is a time standard. A display string constructed from the epoch 1676597835000 can specify a time zone. For example, an application might construct a string to display a date and time as February 17th 1:37:15 AM (GMT) or display a local time February 16th 6:37:15 PM (MST). Note that when the epoch converter displays a local time value a time zone offset is included in the string. Client applications generally determine how they want to construct and display string values representing a date and time. 
GeoEvent Server uses epoch long integer values for its Date values when processing data, and at times converts the Date into a human-readable string for display in the server’s local time zone. To your original question, the UpdateUTC data value is not changed when written to the database. The Date value in a geodatabase feature record is always going to be an epoch long integer value consistent with the ArcGIS REST API. Verizon Connect sends the date/time of a vehicle position report as a UTC value, and that is how the value is being stored in the database. Why a date/time value displayed by the GeoEvent Sampler appears different, for example, than the same value written out to a CSV file or a JSON file is a question of data adaptation and string construction. Rather than displaying the actual epoch long integer value of a Date (such as 1676597835000) GeoEvent Sampler constructs and displays a string representation of the Date using your server’s locale to determine an appropriate time zone. That is why you see the time zone MST included in the GeoEvent Sampler’s displayed string "Thu Feb 16 18:37:15 MST 2023". When you configure a Write to a CSV File output you choose whether date/time values are written out as ISO 8601 formatted values or as strings in a custom format. The Write to a JSON File output, on the other hand, cannot be configured to write out a human-readable string. When reviewing the JSON output file in a text editor you will see the actual epoch long integer value of a Date. I’ll include some examples in the comments below to illustrate how you might expect different client applications to construct display strings from epoch long integer values stored in a geodatabase feature record. But to answer the last part of your question, how do you fix what you are seeing, that depends entirely on what you are trying to do. The epoch long integer underlying a Date cannot express a time zone or an offset from UTC the way a String value can. 
You could configure a Field Calculator or Field Mapper with an expression that either adds or subtracts a number of hours from a Date value, but I really do not recommend this. If you fudge a Date value to shift it from UTC into a local time zone you risk a client application downstream, possibly one outside the ArcGIS Enterprise, constructing a string from an epoch value it assumes is a UTC value and displaying a string which appears incorrect. Suppose, for example, you are using an application like SQL Server Management Studio (SSMS). When the application retrieves a Date whose value you have explicitly offset, it will likely assume the value it retrieved is a UTC value, use your server’s locale to determine an appropriate time zone, and apply the temporal offset a second time. The Date value you explicitly offset by a number of hours as part of your event record processing in GeoEvent Server now appears incorrect when viewed using SSMS. For the simple reason that a database server can be in any time zone, and web client applications that access the data may not be in the same time zone as the server, the recommended best practice is to keep the ArcGIS REST API default and allow feature services to maintain Date values as epoch long integer values in the assumed UTC standard. If you want to calculate and store a local representation of a Date value, calculate the value as a String. You can use the toString( ) function in a Field Mapper, for example, to do this. A String value is a literal string and won’t ever be manipulated to change its value. This might be ideal for displaying attribute values in a web map pop-up, but you cannot use attributes of type String to configure something like the time slider in ArcGIS Pro. 
Instead of computing a String value, or adding/subtracting a number of milliseconds from a Date value, the best approach working with ArcGIS Pro would be to configure its feature layer properties to apply a time zone offset to Date values it retrieves from a geodatabase. As you zoom, pan, and potentially change a map’s temporal extent to see more or fewer features, the date and time strings displayed by ArcGIS Pro should reflect local time rather than a UTC time.
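The point about string construction can be sketched in Python: one stored epoch, two display strings. The fixed UTC-07:00 offset below is an assumption standing in for the MST zone a client might choose.

```python
# Sketch: the same stored epoch yields different display strings depending
# only on the time zone the client uses when constructing the string.
from datetime import datetime, timezone, timedelta

epoch_ms = 1676597835000  # what the geodatabase actually stores
instant = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)

# A client rendering in GMT/UTC shows February 17th, 1:37:15 AM.
gmt_str = instant.strftime("%Y-%m-%d %H:%M:%S GMT")

# A client rendering in MST (UTC-07:00) shows February 16th, 6:37:15 PM.
mst = timezone(timedelta(hours=-7), name="MST")
mst_str = instant.astimezone(mst).strftime("%Y-%m-%d %H:%M:%S MST")
```

The underlying instant never changes; only the constructed string does, which is why the best practice is to store UTC epochs and let each client choose its display zone.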
03-16-2023 01:30 PM

POST
This post was developed by Esri staff with the purpose of modeling real customer questions and positioning them with answers in a way that complements our users’ search processes. We use Verizon Connect to track vehicles in our van pool in Cheyenne Wyoming. The clocks on our vehicles are set to Mountain Standard Time (MST) which is -7 hours relative to UTC. When I use the GeoEvent Sampler to look at data from my input I see the date and time in MST, which is what I want. The data also appears to be in MST when I look at output I’ve written to a CSV file. Here are a couple of screenshots of what I’m seeing: When I look at feature records in the database the date and time look like they’ve been switched to an integer. When I open the feature record’s attribute table in ArcGIS Pro all of the values are offset 7 hours. So, data coming into GeoEvent Server is MST but is somehow being changed when written to the database to switch it to UTC? Why is this happening and how do I fix it?
03-16-2023 01:13 PM

BLOG
ArcGIS 11.1 Updates to the Field Enricher (Feature Service) processor include the ability to select units for the cache refresh time and set the Refresh Rate to as frequent as every 10 seconds. The above cache settings should be used with caution as configuring the processor to frequently query the feature service containing the feature records being used for enrichment can cripple the number of event records you can process through a GeoEvent Service every second. The processor cannot be configured to refresh its cache any faster than once every 10 seconds. Validation is conducted when the GeoEvent Service is published. As @EricIronside mentions above, the cache refresh can be disabled by setting the Cache Refresh Time Interval parameter to 0. This effectively means that when the processor observes an event record with a given attribute join identifier (usually the TRACK_ID), a query to retrieve a single feature record from the feature service will be made and the enrichment data loaded into the processor's cache for that asset record's identifier. The cached value will not be updated as it is set to never expire. Publishing changes to the GeoEvent Service will create a new instance of the processor, whose cache is empty. Stopping and restarting the GeoEvent Server service(s) will also obliterate a Field Enricher (Feature Service) processor instance's cache.
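Purely as an illustration (this is not the processor's actual implementation), the cache semantics described above, where a refresh interval of 0 means a cached enrichment record never expires, might be sketched like this:

```python
# Illustrative sketch of a per-identifier enrichment cache whose entries
# never expire when the refresh interval is 0. Hypothetical class; the real
# Field Enricher (Feature Service) processor's internals are not public.
import time

class EnrichmentCache:
    def __init__(self, fetch, refresh_seconds):
        self.fetch = fetch              # callable that queries the feature service
        self.refresh = refresh_seconds  # 0 == cached values never expire
        self.entries = {}               # track_id -> (value, loaded_at)

    def get(self, track_id):
        hit = self.entries.get(track_id)
        if hit is not None:
            value, loaded_at = hit
            if self.refresh == 0 or time.time() - loaded_at < self.refresh:
                return value            # still fresh, or set to never expire
        # One query per missing (or stale) identifier, then cache the result.
        value = self.fetch(track_id)
        self.entries[track_id] = (value, time.time())
        return value

calls = []
cache = EnrichmentCache(lambda tid: calls.append(tid) or f"asset-{tid}", 0)
cache.get("T1")
cache.get("T1")  # served from cache; the feature service is not queried again
```

Republishing the GeoEvent Service, or restarting GeoEvent Server, corresponds to discarding the cache object entirely, which is why those actions empty the processor's cache.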
03-14-2023 04:50 PM

POST
@wizgis -- It's not a problem that the date and time are part of the same data value being ingested. We actually prefer that. The GeoEvent Definition specifies that the input should be adapting data values as a Date. The fact that acquisitionTime is expressed as an ISO 8601 standard value helps guarantee GeoEvent Server's input will be able to adapt the value it receives and create a Date. I'm not sure that you can configure a Filter literally with $RECEIVED_TIME, but you should be able to configure it with the name of the attribute field (e.g. acquisitionTime) as you show in your illustrations. That is how I tested the configuration I included in my previous response as a screenshot. I did notice what looks like a typo in the one screenshot you shared. I'm sure you took care of that extra bit of punctuation, but I saw it in your screenshot, so I thought I'd mention it. One thing to check is the actual value of the Date your GeoEvent Server adapts from the ISO 8601 value received by the input. Adaptation of an ISO 8601 value assumes UTC. Given the input and output below, you'll note that the value 1669849214000 can be represented as either of the following strings:

- Wed Nov 30 15:00:14 PST 2022
- Wed Nov 30 23:00:14 GMT 2022

A string representation of the epoch value 1669849214000 must be constructed in order to extract values for hh, mm, and ss to plug into the filter's algebra. The filter, I think, is choosing to construct the string in the server's local time zone -- which for me is the first string above, showing PST. Obviously if you are expecting the filter to only keep values between 05:00 and 17:00, and the filter is using strings constructed in local time, then 1669849214000 and 1669852814000 both satisfy the filter. These two epoch values are 15:00:14 and 16:00:14 (local, PST). The values 1669856414000 and 1669860014000 are both greater than 17:00 once the epochs are converted to strings in the local time zone.
The online utility https://www.epochconverter.com can help you convert epoch values to string values in both GMT and your local time zone. At this point, if you are still having trouble configuring the filter and working with the date/time value conversions, I think it would be helpful for you to open an incident with Esri Technical Support so that someone can work with you and your data and correlate the behavior with your time zone. Hope the above helps -- RJ
... View more
03-06-2023
06:12 PM
|
0
|
0
|
1528
|
|
POST
|
-------------------------------------------------
Release: GeoEvent Server 11.1
Processor: GeoTagger
-------------------------------------------------

What's wrong with the GeoEvent Service below? Placing any Filter, but especially a spatial filter configured with an Enter or Exit condition, in front of a processor configured to determine Enter or Exit can produce unexpected results if the filter discards an event record the processor needs in order to correctly evaluate its conditional.

Suppose we have two geofences, Hotel_028 and Indigo_045, as illustrated below. Two locations of a trackable asset, the point locations #1 and #2 on the map, are received and processed through the GeoEvent Service illustrated above. The intent is to capture information about when the vehicle enters and exits either of the geofences. The vehicle's TRACK_ID has never before been reported, so the filters and processors are observing the vehicle's locations for the first time.

Here's the problem: because the vehicle's TRACK_ID has never before been observed, the default behavior for geofence entry, "First GeoEvent triggers Enter", applies. A Filter or Processor evaluating an Enter condition will recognize that the first reported position inside a geofence qualifies as an entry, so the upper event processing branch will log the entry of track point #1 as expected.

The lower event processing branch will not log an exit, because the default for "First GeoEvent triggers Exit" is False. With no prior observations, track point #1 does not satisfy the Exited filter's conditional that a previous event record with the same TRACK_ID was observed "inside" some other geofence, so the filter does not see the event record as exiting an area of interest and discards it. When the second track point #2 is received, the Exited filter, having previously observed the prior track point inside the geofence Hotel_028, recognizes the exiting condition and passes the event record along.
But the processor following the filter has not seen or evaluated the first track point. With no prior observations, track point #2 does not satisfy the GeoTag OnExit GeoTagger processor's conditional, so the processor does not write the name of a geofence from which the track point has exited into the event record it processes.

The fix is to rework the event processing on the lower branch and move the filtering operation after the GeoTagger. Illustrated above, I have placed two mutually exclusive attribute filters after the GeoTagger processor.

The first filter catches the condition in which the processor does not recognize an exiting condition and therefore does not write the name of a geofence into an attribute field named GeoTag. Presumably I want to craft some specific message in this case, so I route the event record through a Field Mapper, which places a simple message (as a String) into a field to indicate that the current event record has not exited an area of interest.

The second filter catches the condition in which the processor does recognize an exiting condition, because the GeoTag attribute field holds the name of a geofence (its value is not null). Again, I want to craft a specific message for this case, so I route the event record through a Field Mapper to place a simple message into a field indicating the date/time the current event record exited an area of interest and the name of the area exited.
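The core issue, that a stateful Enter/Exit evaluation breaks when an upstream filter withholds the observation that would have seeded its state, can be sketched in a few lines of Python. This is not GeoEvent Server code; the box-shaped "geofence" and the fence name Hotel_028 are illustrative stand-ins:

```python
def point_in_fence(pt, fence):
    """Axis-aligned box test standing in for a real geofence check."""
    xmin, ymin, xmax, ymax = fence
    x, y = pt
    return xmin <= x <= xmax and ymin <= y <= ymax

class ExitDetector:
    """Remembers, per TRACK_ID, which fences the previous point was inside."""
    def __init__(self, fences):
        self.fences = fences       # fence name -> (xmin, ymin, xmax, ymax)
        self.last_inside = {}      # track_id -> set of fence names

    def process(self, track_id, pt):
        now_inside = {name for name, f in self.fences.items()
                      if point_in_fence(pt, f)}
        # A fence is "exited" only if the PREVIOUS point for this track
        # was inside it and the current point is not.
        exited = self.last_inside.get(track_id, set()) - now_inside
        self.last_inside[track_id] = now_inside
        return exited

fences = {"Hotel_028": (0, 0, 10, 10)}

# Detector sees every track point: exit is recognized.
det = ExitDetector(fences)
e1 = det.process("V1", (5, 5))    # point #1, inside the fence
e2 = det.process("V1", (20, 5))   # point #2, outside the fence
print(e1, e2)                     # set() {'Hotel_028'}

# If an upstream filter discarded point #1, the detector never builds
# the prior state it needs, and point #2 reports no exit at all.
det2 = ExitDetector(fences)
missed = det2.process("V1", (20, 5))
print(missed)                     # set()
```

This is why moving the filters after the GeoTagger works: the stateful step sees every event record, and the filters then branch on what the processor wrote (or did not write) into the GeoTag field.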
... View more
03-03-2023
05:51 PM
|
0
|
0
|
1285
|