Luis,
Are you using the $ReceivedTime from the input for the dates streaming to your outputs? That date would be in Universal Time (UTC). What time zone are you in? Maybe the TCP output is converting to your local time zone?
DG
Hello RJ Sunderman,
The GEP Field Calculator seems to be a good workaround. However, in our case, the feature class in question has two date fields: CREATEDDATE and MODIFIEDDATE.
Can I use a single GEP Field Calculator processor to apply the offset mechanism to both existing date fields?
- If yes, what should the expression look like? (Give me an example like this: CREATEDDATE + 10800000, MODIFIEDDATE + 10800000. [NOTE: We are in Kuwait, which is GMT+3])
- If no, how should I proceed?
Thanks,
Richardson
Hello Richardson -
If you have two fields in your GeoEvent that you want to offset from UTC to local time, you will need to use two Field Calculator processors.
For example, consider the following input (generic JSON, which can be sent to GeoEvent via HTTP POST):
[ { "SensorID": "BZQT-5480-A", "SensorValue": 53.2, "ReportedDT": "2015-09-23 14:06:22.6 UTC", "CalibrationDate": "2015-07-01 00:00:00.0 UTC" } ]
I configured a 'Receive JSON on a REST Endpoint' input with an 'Expected Date Format' property value of yyyy-MM-dd HH:mm:ss.S z
The input is now configured to handle the data provider's specific string representation of a date/time, expecting the string value to include the time zone specification (in this case, UTC).
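To illustrate what that date format accomplishes, here is a rough Python equivalent of the parsing the input performs. The strptime pattern below is my translation of the Java-style yyyy-MM-dd HH:mm:ss.S z; GeoEvent's actual parser is Java's, so treat this only as a sketch:

from datetime import datetime, timezone

# Parse the provider's date string; %Z matches the literal 'UTC' token
dt = datetime.strptime("2015-07-01 00:00:00.0 UTC", "%Y-%m-%d %H:%M:%S.%f %Z")
dt = dt.replace(tzinfo=timezone.utc)  # make sure the value is treated as UTC

print(int(dt.timestamp() * 1000))  # 1435708800000, the CalibrationDate epoch shown below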
If I wanted a client to display date/times as local time, and the client wasn't configured (or able) to offset the UTC values for me, I would create two additional event fields - one to hold the "ReportedDT" value in local time and one to hold the "CalibrationDate" value in local time. I suggest this because deliberately falsifying the actual UTC date/time values by overwriting them with computed offsets is bad practice.
So, I copy the GeoEvent Definition used by the input to create a new event definition and add the needed fields to it. Then I use a Field Mapper to map the received data into the new schema, leaving the two local time fields unmapped. I now have a GeoEvent with two empty fields to which I can write calculated values (and I don't have to deal with the Field Calculator dynamically creating managed GeoEvent Definitions for me).
Here's my resulting GeoEvent Service, showing the configuration of each Field Calculator. You indicated that you wanted to shift the UTC values forward three (3) hours to Kuwait local time, so I add the equivalent number of milliseconds (3 hr x 60 min/hr x 60 sec/min x 1000 ms/sec = 10,800,000 ms) to each original date/time value, configuring each Field Calculator to write its result into one of the prepared fields.
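In plain arithmetic, each Field Calculator expression is just the original epoch value plus that constant - something along the lines of ReportedDT + 10800000 and CalibrationDate + 10800000, each targeting one of the two new fields. A quick Python check of the numbers, using the ReportedDT value from the output below:

# Kuwait local time is UTC+3, so the offset is three hours in milliseconds
OFFSET_MS = 3 * 60 * 60 * 1000   # = 10800000

reported_dt = 1443017182006      # ReportedDT as epoch milliseconds (UTC)
print(reported_dt + OFFSET_MS)   # 1443027982006 - the LocalTimeReported value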
The output, in Esri Feature JSON format, would look something like this:
[ { "attributes": { "SensorID": "BZQT-5480-A", "SensorValue": 53.2, "ReportedDT": 1443017182006, "CalibrationDate": 1435708800000, "LocalTimeReported": 1443027982006, "LocalTimeCalibrated": 1435719600000 } } ]
Notice that we've preserved the "ReportedDT" and "CalibrationDate" values reported by the sensor. By combining the Field Mapper with the Field Calculators, we've effectively enhanced the sensor data to include reported and calibration date/time values offset to Kuwait/Riyadh local time. You can use online utilities such as EpochConverter to convert the epoch millisecond values to a human-readable date/time.
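If you'd rather not depend on an online utility, a few lines of Python perform the same conversion (the field names in the comments refer to the output above):

from datetime import datetime, timezone

# Render the epoch millisecond values as human-readable UTC timestamps
for ms in (1443017182006, 1443027982006):
    print(datetime.fromtimestamp(ms / 1000, tz=timezone.utc))

# 2015-09-23 14:06:22.006000+00:00   (ReportedDT, the true UTC instant)
# 2015-09-23 17:06:22.006000+00:00   (LocalTimeReported: the shifted value, read as Kuwait wall-clock time)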
Hope this information helps -
RJ
Hello RJ,
Thanks a lot - this worked for me. In my case, the input date and time are already in local time, so I am not using the Field Mapper. I just used two Field Calculators and I am getting the output we needed.
Enjoy your day.
Regards,
Richardson