POST
|
Date arithmetic is awful. The best thing is to convert all date fields to ISO 8601 strings. Comparisons and ranges then become easy, and if you need real date arithmetic you can parse the ISO string back into a date object. ISO format is designed so that dates sort correctly as strings, and the timezone can be appended as Z for UTC or as an explicit offset. Because the fields are fixed width, it is easy to slice out part of the date. Native date formats often do not go back before 1970 (Unix epoch) or 1900 (Microsoft Excel), so don't rely on them for historical dates.
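A minimal stdlib sketch of the idea (the sample dates are arbitrary):

```python
from datetime import datetime, timezone

# Convert a datetime to an ISO 8601 string (UTC, "Z" suffix).
def to_iso(dt: datetime) -> str:
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

a = to_iso(datetime(2025, 6, 25, 6, 13, tzinfo=timezone.utc))
b = to_iso(datetime(2025, 6, 25, 5, 58, tzinfo=timezone.utc))

# Plain string comparison matches chronological order.
assert b < a

# Fixed widths make it trivial to slice out components.
year, month = a[0:4], a[5:7]
assert (year, month) == ("2025", "06")

# Round-trip back to a datetime for real arithmetic.
parsed = datetime.strptime(a, "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=timezone.utc)
```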
06-25-2025
06:13 AM
|
0
|
1
|
140
|
POST
|
Date arithmetic is awful. The best thing is to convert all date fields to ISO 8601 strings. Comparisons and ranges then become easy, and if you need real date arithmetic you can parse the ISO string back into a date object. ISO format is designed so that dates sort correctly as strings, and the timezone can be appended as Z for UTC or as an explicit offset. Because the fields are fixed width, it is easy to slice out part of the date. Native date formats often do not go back before 1970 (Unix epoch) or 1900 (Microsoft Excel), so don't rely on them for historical dates.
06-25-2025
05:58 AM
|
0
|
0
|
142
|
POST
|
If you just edit the dashboard and replace the source, all settings will be lost, so you have to rebuild the dashboard from scratch.
06-03-2025
07:26 AM
|
0
|
0
|
174
|
POST
|
In short, you can't! At least, it is not supported in the interactive interface. There is now an unsupported workaround from the Esri consulting team using the AGOL Assistant, where you can do a search and replace on the item ID, provided you know the item IDs. (Just do a search for AGOL Assistant and log in again.) The new item ID is easy enough to find from the new webmap, and you may still have the old webmap. If you have a new feature layer, create a new webmap that references the new featurelayer; otherwise you will have to do two updates, one for the webmap and another for the featurelayer. I wrote a short Python script that recursively walks the dashboard JSON to find all the item IDs it references. I intend to do the update step too, but haven't finished it yet.
# dashboard references 2
from arcgis.gis import GIS
import os
import sys

import arcpy

gis = GIS(profile='econet')

# Step 2: Get the Dashboard Item
try:
    dashboard_id = sys.argv[1]
except IndexError:
    # dashboard_id = "0f3670488cfc4ea19f4a8e22252979fd"  # RHB Trapping dashboard
    dashboard_id = "eafa864b127242b29b0f5d08fc20e017"  # PFK Trapping dashboard

dashboard_item = gis.content.get(dashboard_id)
if not dashboard_item:
    arcpy.AddMessage("Dashboard not found. Check the dashboard ID.")
    sys.exit()
arcpy.AddMessage(f"<{dashboard_item.title}> found")

# Step 3: Access Dashboard's Configuration JSON (Data Block)
dashboard_data = dashboard_item.get_data()
if not dashboard_data:
    arcpy.AddMessage("No data block found in the dashboard.")
    sys.exit()

# Step 4: Recursive Function to Extract Item IDs
def extract_item_ids(data, referenced_items):
    if isinstance(data, dict):
        for key, value in data.items():
            if key == "itemId":  # Check for itemId fields
                referenced_items.append(value)
            else:
                extract_item_ids(value, referenced_items)  # Recurse into nested dictionaries
    elif isinstance(data, list):
        for item in data:
            extract_item_ids(item, referenced_items)  # Recurse into nested lists

# Initialize a list to store referenced IDs
referenced_items = []
extract_item_ids(dashboard_data, referenced_items)

# Step 5: Fetch Referenced Item Details
unique_item_ids = set(referenced_items)  # Remove duplicates
item_details = []
for item_id in unique_item_ids:
    item = gis.content.get(item_id)
    if item:
        item_details.append({
            "Item ID": item_id,
            "Title": item.title,
            "Type": item.type
        })

# Step 6: Output Results
if item_details:
    arcpy.AddMessage("Referenced Items:")
    for item in item_details:
        arcpy.AddMessage(f"Item ID: {item['Item ID']}, Title: {item['Title']}, Type: {item['Type']}")
else:
    arcpy.AddMessage("No referenced items found in the dashboard.")

# Optional: Save to CSV
import pandas as pd
df = pd.DataFrame(item_details)
df.to_csv("referenced_items.csv", index=False)
arcpy.AddMessage("Referenced items saved to referenced_items.csv")
arcpy.AddMessage(f"{os.getcwd()}")
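For the unfinished update step, the core would be a recursive replace over the same JSON; a minimal sketch (the `old_id`/`new_id` values are illustrative, and the final `update` call is an assumption that should be checked against the arcgis API docs, not something tested against a live dashboard):

```python
import json

# Recursively replace one item ID with another anywhere in a dashboard's
# JSON data block (dicts, lists, and embedded string values).
def replace_item_id(data, old_id, new_id):
    if isinstance(data, dict):
        return {k: replace_item_id(v, old_id, new_id) for k, v in data.items()}
    if isinstance(data, list):
        return [replace_item_id(v, old_id, new_id) for v in data]
    if isinstance(data, str):
        return data.replace(old_id, new_id)
    return data

demo = {"widgets": [{"itemId": "aaa111", "title": "map"}], "source": "ref aaa111"}
fixed = replace_item_id(demo, "aaa111", "bbb222")
# fixed == {"widgets": [{"itemId": "bbb222", "title": "map"}], "source": "ref bbb222"}

# Pushing it back would be something like (hypothetical, unverified):
# dashboard_item.update(item_properties={"text": json.dumps(fixed)})
```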
06-03-2025
07:20 AM
|
0
|
0
|
174
|
POST
|
Good idea, but how will it help? I really should look up some logs to see how long it took in ArcMap. Since Esri have turned off my licence for the Pro downgrade, I cannot run it now. I really just want normal speed. I have got rid of any BigInteger and BigObjectID fields (well, you have to, or CreateRelation() crashes). April 2024, ArcMap:
--------------------
Build relates 10:34sec (expected time!!)
Rebuild CoraxPro Relates
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Building sap_rel...
Process: Relationship Class... sta-sap rel_parcel_gazette
Building sta_rel...
Process: Relationship Class... sta-ste rel_gazette_act
Building ste_rel...
Process: Create Relationship Class... nmi_rel rel_title_name
licence is ArcInfo
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Process: Relationship Class... sta-sap rel_parcel_gazette
Process: Relationship Class... sta-ste rel_gazette_act
Process: Create Relationship Class... nmi_rel rel_title_name
Python error in nmi "Nominal_Index" does not exist
Well Done 0:13:58.288000
Feb 2025 ArcGISPro same script upgraded for Python 3.....
========================================================
Build relates 10:34sec expected
Rebuild CoraxPro Relates
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
ERROR: Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of abey_prior_status | cadastral_surv_acc | certified_date | chf_sur_amnd_date | data_source | dataset_id | dataset_series | dataset_suffix | description | dlr_amnd_date | et_created | et_edited | OBJECTID | registered_date | splan | survey_class | survey_date | surveyor_data_ref | type_of_dataset | usr_id_sol.
Failed to execute (TableToRelationshipClass).
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Building sap_rel...
Process: Relationship Class... sta-sap rel_parcel_gazette
Building sta_rel...
Process: Relationship Class... sta-ste rel_gazette_act
Building ste_rel...
Process: Create Relationship Class... nmi_rel rel_title_name
Building nmi_rel...
licence is ArcInfo
Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of abey_prior_status | cadastral_surv_acc | certified_date | chf_sur_amnd_date | data_source | dataset_id | dataset_series | dataset_suffix | description | dlr_amnd_date | et_created | et_edited | OBJECTID | registered_date | splan | survey_class | survey_date | surveyor_data_ref | type_of_dataset | usr_id_sol.
Failed to execute (TableToRelationshipClass).
Start Time: Friday, 23 February 2024 7:52:46 pm
Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of et_created | et_edited | OBJECTID | original_flag | purpose | share | status | term | timeshare_week_no | ttl_title_no | type.
Failed to execute (TableToRelationshipClass).
Failed at Friday, 23 February 2024 7:52:46 pm (Elapsed Time: 0.06 seconds
==================================================================
Well Done 2:46:45.2 WHAT? and if the field widths are fixed it's 5 hours!
Note: the failures are due to BigInteger fields that had crept in, as I have now realised.
05-09-2025
02:48 AM
|
0
|
0
|
256
|
POST
|
from arcgis.gis import GIS
from arcgis.features import FeatureLayer
from datetime import datetime, timedelta, timezone

# Connect to ArcGIS Online or Enterprise
gis = GIS("https://www.arcgis.com", "your_username", "your_password")  # Replace with your credentials

# Access your hosted feature layer
feature_layer_url = "https://services.arcgis.com/your_layer_url/FeatureServer/0"
layer = FeatureLayer(feature_layer_url)

# Query features; put in a filter to only update empty target fields
features = layer.query().features

# Daylight savings logic for New Zealand (NZDT/NZST)
def get_nz_offset(date):
    month = date.month
    day = date.day
    # Approximate daylight savings: NZDT starts last Sunday in September, ends first Sunday in April
    if (month > 9 or (month == 9 and day >= 25)) or (month < 4 or (month == 4 and day <= 7)):
        return timedelta(hours=13)  # NZDT (UTC+13)
    else:
        return timedelta(hours=12)  # NZST (UTC+12)

# Update each feature with local time conversion
updates = []
for feature in features:
    # Esri stores dates as milliseconds since the Unix epoch, in UTC
    utc_time = datetime.fromtimestamp(feature.attributes["timestamp_field"] / 1000, tz=timezone.utc)
    local_time = utc_time + get_nz_offset(utc_time)
    feature.attributes["local_timestamp_field"] = int(local_time.timestamp() * 1000)  # Back to milliseconds
    updates.append(feature)

# Apply updates to the feature layer
layer.edit_features(updates=updates)
# Instead of writing the result back into another date field,
# you could store the local date as an ISO string
print("Time zone conversion completed!")
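The offset helper above is pure stdlib, so its boundary behaviour can be checked in isolation; a small sketch (dates chosen only to illustrate the approximation):

```python
from datetime import datetime, timedelta

# Same approximation as in the script above: NZDT from roughly the last
# Sunday in September to roughly the first Sunday in April.
def get_nz_offset(date):
    month, day = date.month, date.day
    if (month > 9 or (month == 9 and day >= 25)) or (month < 4 or (month == 4 and day <= 7)):
        return timedelta(hours=13)  # NZDT (UTC+13)
    return timedelta(hours=12)      # NZST (UTC+12)

assert get_nz_offset(datetime(2025, 1, 15)) == timedelta(hours=13)  # mid-summer: NZDT
assert get_nz_offset(datetime(2025, 6, 15)) == timedelta(hours=12)  # mid-winter: NZST
assert get_nz_offset(datetime(2025, 10, 1)) == timedelta(hours=13)  # after late-September switch
```

On Python 3.9+ the exact rule is available without approximation via `zoneinfo.ZoneInfo("Pacific/Auckland")`, which may be worth the switch if boundary dates matter.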
05-09-2025
02:43 AM
|
0
|
1
|
517
|
POST
|
The replace layer option in the ArcGISPro share dialog needs exclusive access. It crashes if I also have the layer open in AGOL. That might also be a problem if the layer is being used in Survey123 or FieldMaps. It is also not dynamic. New records will need to be processed. I thought of a virtual field, but that might not be possible.
05-08-2025
02:18 PM
|
0
|
0
|
560
|
POST
|
A hosted featurelayer of half a million records? I assume it is on your own server, not ArcGISOnline? The storage charges are per 10MB, not per Gigabyte! Have you tried a virtual field(s) in the WebMap? Then you can use Arcade to generate the new fields dynamically.
05-08-2025
02:13 PM
|
0
|
1
|
561
|
POST
|
I have just upgraded to 3.4.0, which is no better. It has been a problem ever since 3.3.1. Maybe due to the BigInteger introduction?
05-08-2025
02:07 PM
|
1
|
2
|
283
|
POST
|
Why are you using shapefiles and Excel tables? You would be much better off using a proper database format with tables. Then you can validate the data and control the data schema. It will be faster and more robust, and it will handle the difference between blanks and null values. You can use a Microsoft schema.ini file to properly define the schema for the spreadsheet data (exported as CSV/text) before you import it into a database table. Once in a database, the keys you are going to use for the table join can be indexed for speed and set to NOT NULL to make sure they are always populated, and the schema is generally documented. The codes with dashes have to be strings, but that may need to be set explicitly.
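For illustration, a minimal schema.ini entry, assuming the spreadsheet has been exported as parcels.csv with a dash-separated code column (the file and column names here are hypothetical):

```ini
[parcels.csv]
Format=CSVDelimited
ColNameHeader=True
Col1=parcel_code Text Width 12
Col2=legal_desc Text Width 80
Col3=area_ha Double
```

Declaring `parcel_code` as Text stops the driver from guessing it is a date or number when it sees values like 12-345.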
05-07-2025
05:38 PM
|
0
|
0
|
228
|
POST
|
If anything takes longer than a cup of coffee, interrupt the process and find a better way! It is likely to crash with no results if it has to run for days. Updating an online table from ArcGISPro only works for a few records. If you need a bulk update, the best way is to export the table to a local filegeodatabase, do the processing on the local machine, and then replace the online layer. The replace-layer function actually does a staging update: it zips up the file and metadata for the layers, uploads the zip, and then unpacks it in the cloud. This all takes a few seconds, or at most minutes.
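A sketch of that bulk workflow with the arcgis Python API, untested here; the item ID and the `overwrite` call reflect my understanding of `FeatureLayerCollection.manager.overwrite` and should be checked against the current API docs:

```python
import shutil
from pathlib import Path

# Zip a local file geodatabase folder so it can be uploaded in one piece.
def zip_fgdb(fgdb_path: str) -> str:
    fgdb = Path(fgdb_path)
    return shutil.make_archive(str(fgdb.with_suffix("")), "zip",
                               root_dir=fgdb.parent, base_dir=fgdb.name)

def replace_hosted_layer(gis, item_id: str, fgdb_zip: str):
    # Hypothetical overwrite step; requires the arcgis package and a login.
    from arcgis.features import FeatureLayerCollection
    item = gis.content.get(item_id)
    flc = FeatureLayerCollection.fromitem(item)
    flc.manager.overwrite(fgdb_zip)  # staging update: upload zip, unpack in the cloud
```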
05-07-2025
05:30 PM
|
1
|
2
|
586
|
POST
|
Use FME. That will work forever; they never deprecate earlier readers. It is also sold as the Data Interoperability extension if you want to pretend that it is an Esri extension.
05-07-2025
05:21 PM
|
2
|
0
|
543
|
POST
|
I have a Python script that built database relates in ArcMap in a filegeodatabase. They are many-to-many relates between three tables, using a relationship table in the middle. This is the design from the source in PostGIS that I want to replicate. The keys between the tables are indexed integer fields. This all worked well in ArcMap, and building the relates took a few minutes. Now, in ArcGISPro, the same script takes many hours! What has happened? [As a side issue, BigInteger keys cause a crash; they must be 32-bit Integers.] The tables are medium sized, typically 2M records. Note that you cannot rename any tables afterwards; you have to rebuild from scratch. All tables must be in the same filegeodatabase. The relationship table that gets built is semi-hidden and hard to delete in a script. Has anyone else noticed this? Do you have any suggestions to restore the speed? Maybe remove the indexes before building the relates and re-create them afterwards.
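To pinpoint where the hours go, each geoprocessing call can be wrapped in a timer; a minimal stdlib sketch (the step name is illustrative, and the arcpy call would go inside the `with` block):

```python
import time
from contextlib import contextmanager

timings = {}

# Context manager that records elapsed wall-clock time per named step.
@contextmanager
def timed(step):
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[step] = time.perf_counter() - start

with timed("rel_parcel_legal"):
    # arcpy.management.TableToRelationshipClass(...) would run here
    time.sleep(0.01)

for step, secs in timings.items():
    print(f"{step}: {secs:.2f}s")
```

Comparing the per-step numbers between the ArcMap and Pro runs would show whether the slowdown is spread evenly or concentrated in one tool.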
05-07-2025
05:13 PM
|
1
|
4
|
354
|
POST
|
Have you considered the GroupBy function in Arcade? This seems to be very useful to group multiple names and get a count.
04-27-2025
03:49 PM
|
0
|
0
|
479
|
POST
|
I made a mistake and deleted a featurelayer used in a dashboard. I found out in time (2 weeks) so I was able to restore it. That got me thinking: how could I back up (and restore!) my precious work? There are a few problems:
1. You can export the JSON file defining the dashboard, but that is not enough. You also need the items referenced by the dashboard. That will mostly be a WebMap. But wait... the WebMap references FeatureLayers and Tables. So there needs to be a way of making a list of dependencies that must also be backed up.
2. All the references in AGOL use item_id GUID codes that are read-only. This means that simply recreating a lost item will not work unless it has the SAME id.
3. Trying to repair a dashboard with a replacement item is impossible in the interactive interface. As soon as you change the source, ALL settings are removed. You may as well be starting again.
So there are no simple tools to package up a project into a zip file and store it for restoration later. This means the work is very fragile. No wonder there is a flag to avoid accidental deletion (which I overrode!). But I am a Python programmer. Surely there is a simple arcgis function or module to make this routine? Nothing that I could find. The closest is the ArcGIS Assistant, which can do a bit of a hack on the JSON defining the items. So I attempted to write my own. This is harder than it seems, even with the useless help of Copilot; I had to keep correcting its obvious lack of understanding of the JSON structure. The first step is to get a list of item_ids referenced in the Dashboard. This is hard: you have to do a recursive search or you won't get anything, even though you know they are in there somewhere. Once you get the WebMap ids you can save their JSON and move on to the FeatureLayers. The FeatureLayers return a binary ServiceDefinition, which is a rabbit hole; I just want the item_id. Then there is the restore process.
We will need a list of all these separate file dumps to return to AGOL, together with the groups, owners and permissions. I can see why there is a market for third-party products. So this is an unsolved workflow for me. Is there something I have missed? How do other people backup/restore projects on AGOL? Do you just put it in the too-hard basket?
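A first cut at the dump step could just walk the dashboard JSON and save each referenced item's data into one zip with a manifest; a sketch, assuming an authenticated `gis` object and using the documented `Item.get_data()` call (the manifest layout is my own invention, and groups/owners/permissions are not captured):

```python
import json
import zipfile

# Collect every "itemId" value from a nested JSON structure.
def collect_item_ids(data, found):
    if isinstance(data, dict):
        for key, value in data.items():
            if key == "itemId" and isinstance(value, str):
                found.add(value)
            else:
                collect_item_ids(value, found)
    elif isinstance(data, list):
        for entry in data:
            collect_item_ids(entry, found)
    return found

def backup_dashboard(gis, dashboard_id, zip_path):
    # Requires the arcgis package and a login; untested sketch.
    dash = gis.content.get(dashboard_id)
    ids = collect_item_ids(dash.get_data(), {dashboard_id})
    with zipfile.ZipFile(zip_path, "w") as zf:
        manifest = []
        for item_id in ids:
            item = gis.content.get(item_id)
            if item:
                zf.writestr(f"{item_id}.json", json.dumps(item.get_data()))
                manifest.append({"id": item_id, "title": item.title, "type": item.type})
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
```

This only covers the JSON side; hosted FeatureLayer data itself would still need a separate export (e.g. to a filegeodatabase) to be truly restorable.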
04-04-2025
03:16 PM
|
3
|
0
|
258
|