I am using the Python API in a Jupyter notebook to access a feature service in AGOL, with the goal of turning it into a dataframe, running some analysis, and then exporting it to Excel. Everything works fine except the date columns keep coming in as long integers (e.g. 1519249914451). I have attempted to change the column's dtype from int64 to datetime64[ns], but the date always comes back as 1970-01-01 00:25:19.249914451. The date should be in 2018.
Is there a different conversion that I should be using for this to get the correct date?
The date is returned as a Unix timestamp (epoch time) in milliseconds, usually in UTC. I use something like this to convert it to a locally formatted date string:
import re
import time

def ts(t):  # timestamp_to_date
    # return time.strftime('%Y-%m-%d %H:%M:%S UTC', time.gmtime(t/1000)) if t is not None else ''  # UTC time
    if t is not None:
        # epoch milliseconds (UTC) to local time; divide by 1000 to get seconds
        ts = time.strftime('%Y-%m-%d %H:%M:%S %Z', time.localtime(t/1000))
        # keep the date/time portion, strip non-uppercase characters from the timezone abbreviation
        return ts[:21] + re.sub('[^A-Z]', '', ts[21:])
    else:
        return ''
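If you want the conversion to happen inside the dataframe itself rather than producing strings, pandas can interpret the millisecond epoch directly. This is a minimal sketch (the column name `EditDate` and the sample value are just illustrative); the key is `unit='ms'`, which tells pandas the integers are milliseconds rather than the nanoseconds that a plain cast to datetime64[ns] assumes — that mismatch is exactly why the dates came back in 1970.

```python
import pandas as pd

# AGOL feature services return dates as Unix epoch *milliseconds*.
# Passing unit='ms' to pd.to_datetime interprets them correctly (as UTC).
df = pd.DataFrame({'EditDate': [1519249914451]})
df['EditDate'] = pd.to_datetime(df['EditDate'], unit='ms')
print(df['EditDate'].iloc[0])  # 2018-02-21 21:51:54.451
```

The resulting datetime64[ns] column survives analysis and exports cleanly to Excel via `df.to_excel(...)`.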