I've got a hosted feature class with almost half a million records. I want staff to be able to filter this very large feature class by date in Field Maps, but that's not a feature yet.
Instead, I'm going to create text fields that hold all possible values for year, month, and day so that staff can filter with these criteria.
The only problem is, my feature class is huge and I'm not sure how to manage the timezone conversion. In Pro, I can easily calculate the day, month and year from the UTC datetime, but I need to convert to Canada/Eastern (America/Toronto) first, since that's what they will see in Field Maps on their devices.
I've been mucking around for several hours with Notebooks and Field Calculator to no avail. I'd prefer a fast solution, but I'm not very experienced with Notebooks, so I keep getting bogged down.
My date-time field is named 'timestamp' and the fields I want to fill are 'year_', 'month_' and 'day_'.
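For reference, the core conversion itself is simple in plain Python with the standard-library zoneinfo module. This is just a sketch of the date math, assuming the field holds UTC epoch milliseconds (the way hosted layers store date-times) and using the field naming from my layer; it is not an Esri-specific solution:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library, Python 3.9+

def utc_ms_to_local_parts(epoch_ms, tz_name="America/Toronto"):
    """Convert UTC epoch milliseconds to (year_, month_, day_) strings in tz_name."""
    utc_dt = datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc)
    local_dt = utc_dt.astimezone(ZoneInfo(tz_name))  # handles EST/EDT automatically
    return str(local_dt.year), f"{local_dt.month:02d}", f"{local_dt.day:02d}"

# 2023-11-14 22:13:20 UTC is 17:13:20 EST (UTC-5) in Toronto
print(utc_ms_to_local_parts(1700000000000))  # ('2023', '11', '14')
```

Because zoneinfo uses the IANA time zone database, daylight saving transitions are handled for you instead of being approximated by hand.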
Thanks for any help!
Erica
Time Zone (Environment setting)—ArcGIS Pro | Documentation
Convert Time Zone (Data Management)—ArcGIS Pro | Documentation
I don't know if this will help, but you have this under ArcGIS Pro questions, so perhaps start by having a look at controlling/converting time within Pro itself
I was trying to avoid adding a field to my layer, since it's part of an offline map, but adding a field, calculating eastern time using the convert time zone tool, then recalculating the month/day/year works. It takes a LONG time (25 minutes for 3300 points). Might have to leave my computer chugging away over the weekend...
If anything takes longer than a cup of coffee, interrupt the process and find a better way! It will likely crash with no results if it has to run for days.
Updating an online table from ArcGIS Pro only works well for a few records. If you need a bulk update, the best way is to export the table to a local file geodatabase, do the processing on a local machine, and then replace the online layer. The Replace Layer function actually does a staging update: it zips up the file and metadata for the layers, uploads them, and then unpacks everything in the cloud. This all takes a few seconds, or at most a few minutes.
This is genius, thanks! I'd likely have to do it in the following steps since these data are included in offline maps (this process has worked for me with smaller layers):
- download the data locally and process the changes
- delete the data in the online database
- append the new records from the local file
Do you think the append would take forever or would it be fairly quick? Otherwise I have to rebuild offline maps since it won't let me overwrite a layer that has offline replicas. I could also just wait until the next time I have to rebuild offline maps.
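One way to keep a big append from feeling endless is to push the edits in batches, so each request stays small and you can watch the progress. A minimal sketch of the batching helper (the actual edit_features calls, which need the arcgis package and credentials, are shown commented out as hypothetical usage):

```python
from itertools import islice

def chunked(iterable, size):
    """Yield successive lists of at most `size` items from iterable."""
    it = iter(iterable)
    while batch := list(islice(it, size)):
        yield batch

# Hypothetical bulk append against a hosted layer (requires arcgis + credentials):
# for batch in chunked(new_features, 1000):
#     layer.edit_features(adds=batch)

print(list(chunked(range(5), 2)))  # [[0, 1], [2, 3], [4]]
```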
The replace layer option in the ArcGIS Pro share dialog needs exclusive access. It crashes if I also have the layer open in AGOL, and that might also be a problem if the layer is being used in Survey123 or Field Maps.
It is also not dynamic: new records will need to be processed again. I thought of a virtual field, but that might not be possible.
A hosted feature layer of half a million records? I assume it is on your own server, not ArcGIS Online? The storage charges are per 10 MB, not per gigabyte!
Have you tried a virtual field(s) in the WebMap? Then you can use Arcade to generate the new fields dynamically.
It's on AGOL... doesn't seem to take many credits, for whatever reason. Not too many columns, and not very complex data.
from arcgis.gis import GIS
from arcgis.features import FeatureLayer
from datetime import datetime, timedelta, timezone

# Connect to ArcGIS Online or Enterprise
gis = GIS("https://www.arcgis.com", "your_username", "your_password")  # Replace with your credentials

# Access your hosted feature layer
feature_layer_url = "https://services.arcgis.com/your_layer_url/FeatureServer/0"
layer = FeatureLayer(feature_layer_url)

# Query features; add a where clause here to only update rows whose target fields are still empty
feature_set = layer.query()

# Daylight saving logic for New Zealand (NZDT/NZST)
def get_nz_offset(date):
    month = date.month
    day = date.day
    # Approximate daylight saving: NZDT starts last Sunday in September, ends first Sunday in April
    if (month > 9 or (month == 9 and day >= 25)) or (month < 4 or (month == 4 and day <= 7)):
        return timedelta(hours=13)  # NZDT (UTC+13)
    else:
        return timedelta(hours=12)  # NZST (UTC+12)

# Update each feature with the local-time conversion
updates = []
for feature in feature_set.features:
    # Hosted layers store date-times as UTC epoch milliseconds
    utc_time = datetime.fromtimestamp(feature.attributes["timestamp_field"] / 1000, tz=timezone.utc)
    local_time = utc_time + get_nz_offset(utc_time)
    feature.attributes["local_timestamp_field"] = int(local_time.timestamp() * 1000)  # back to milliseconds
    updates.append(feature)

# Apply the updates to the feature layer
layer.edit_features(updates=updates)

# Instead of writing back into another date-time field, you could
# write the year, month and day out as strings
print("Time zone conversion completed!")
Can you provide a bit of context around the code (why is it a good solution, is it fast, etc.)?