POST
You still have unlimited access to the Python geometry engine, so you can re-program every tool that you need using arcpy. It is more low-level work: processes work on pairs of geometry objects, so you will need to iterate over the featureclass. Often you can program in a shortcut that makes it faster than the general tool. If you only need a few of the restricted tools, this may be worth it to keep using the Basic licence. I agree that not being able to create more complex featureclasses is a serious drawback. https://pro.arcgis.com/en/pro-app/latest/arcpy/classes/geometry.htm

This often happens to me when a customer wants an app that works only on Basic. If there is a tool not available, I create my own. For example, I wanted to densify a polygon from four straight lines to many vertices so that when it was projected the lines became curves. Oops, Standard or Advanced only. But I only had one polygon to edit, and that can be done easily by adding vertices in a script. You can move selections easily between a featureclass and an object list, and much can be done in memory or with Python lists.

There are some ArcMap tools that have disappeared that I have rebuilt, e.g. Frequency with Case. I often want to use a composite key, but relates only work with a single field, so I just run my tool that creates a new case# that is a single number for the composite key. Many others that I used to have are now in Pro, such as Convex Hull. http://www.ollivier.co.nz/support/python_tips/index.htm Look at the sample Furthest Town task for an example of how to replace Near().
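A minimal sketch of the densify-by-script idea (the featureclass name and step count are placeholders, not the original script, and it assumes a simple polygon without interior rings):

import arcpy

def densify_polygon(polygon, steps=20):
    """Return a new arcpy.Polygon with extra vertices along each segment."""
    sr = polygon.spatialReference
    new_parts = arcpy.Array()
    for part in polygon:                      # each part is an arcpy.Array of Points
        pts = [p for p in part if p]          # skip the None separators used for holes
        new_part = arcpy.Array()
        for a, b in zip(pts, pts[1:]):
            for i in range(steps):            # interpolate extra vertices along the segment
                t = i / float(steps)
                new_part.add(arcpy.Point(a.X + t * (b.X - a.X),
                                         a.Y + t * (b.Y - a.Y)))
        new_part.add(pts[-1])                 # keep the closing vertex
        new_parts.add(new_part)
    return arcpy.Polygon(new_parts, sr)

with arcpy.da.UpdateCursor("my_fc", ["SHAPE@"]) as cur:   # "my_fc" is a placeholder
    for row in cur:
        row[0] = densify_polygon(row[0])
        cur.updateRow(row)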
Posted 03-13-2026, 08:30 PM

BLOG
I have a solution where the field is redefined in the geopackage before uploading. A second version allows for non-UTC datetimes.

import pandas as pd
import geopandas as gpd
from datetime import datetime

def normalise_date_fields(gdf, use_utc: bool = True, local_tz=None, verbose: bool = True):
    """
    Convert epoch-based date fields to timezone-aware datetimes.
    Targets columns with 'date' or 'time' in the name.

    Parameters
    ----------
    gdf : GeoDataFrame
        Input GeoDataFrame.
    use_utc : bool, default True
        If True, output datetimes are in UTC (tz-aware, +00:00).
        If False, output datetimes are converted to the local timezone (tz-aware with offset).
    local_tz : str | tzinfo | None
        Timezone to use when use_utc=False.
        - None => use the machine's local timezone.
        - str  => an IANA name, e.g. "Pacific/Auckland".
    verbose : bool
        Print conversion messages.

    Returns
    -------
    GeoDataFrame
        GeoDataFrame with converted datetime columns where applicable.
    """
    # Resolve the local timezone (only used when use_utc=False)
    if not use_utc and local_tz is None:
        # Use the runtime environment's local timezone.
        # If this misbehaves in your environment, pass local_tz="Pacific/Auckland" explicitly.
        local_tz = datetime.now().astimezone().tzinfo
    # An IANA string such as "Pacific/Auckland" or a tzinfo object is passed straight to pandas.

    for col in gdf.columns:
        col_lc = col.lower()
        if "date" in col_lc or "time" in col_lc:
            series = gdf[col]
            # Only attempt conversion on numeric columns
            if pd.api.types.is_numeric_dtype(series):
                max_val = series.dropna().max()
                # Heuristic: epoch milliseconds are usually > 1e12
                if pd.notna(max_val) and max_val > 1e12:
                    try:
                        # Step 1: interpret epoch(ms) as UTC
                        dt_utc = pd.to_datetime(series, unit="ms", utc=True, errors="coerce")
                        # Step 2: optionally convert to the local timezone (keeping the offset)
                        if use_utc:
                            gdf[col] = dt_utc
                            if verbose:
                                print(f" ✓ Converted epoch(ms) → datetime (UTC): {col}")
                        else:
                            gdf[col] = dt_utc.dt.tz_convert(local_tz)
                            if verbose:
                                print(f" ✓ Converted epoch(ms) → datetime (local offset): {col} ({local_tz})")
                    except Exception as e:
                        if verbose:
                            print(f" ⚠ Failed to convert {col}: {e}")
    return gdf

# -------- main -----------
# other processing...
gdf = gpd.GeoDataFrame.from_features(features)
gdf = gdf.set_crs("EPSG:4326")
gdf = normalise_date_fields(gdf)
gdf.to_file(GPKG_PATH, layer=layer, driver="GPKG")
print(f" Saved: {layer} ({len(gdf)} features)")
Posted 03-13-2026, 05:17 PM

POST
Reinstalling is not the solution. It is to do with configuration in your <user>\AppData\Local\Esri\ArcGISPro and <user>\AppData\Roaming\Esri\ArcGISPro folders. Try renaming them to force Pro to rebuild them. Other things can also hang Pro, such as partly failed web connections, graphics-card incompatibility and a host of other settings that are not cleared by a reinstall.
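A minimal sketch of the rename step (assuming Pro is closed; both folders are rebuilt on the next start):

import os
from datetime import date

for env_var in ("LOCALAPPDATA", "APPDATA"):
    folder = os.path.join(os.environ[env_var], "Esri", "ArcGISPro")
    if os.path.isdir(folder):
        # Rename rather than delete, so the old configuration can be restored if needed
        os.rename(folder, f"{folder}_backup_{date.today().isoformat()}")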
Posted 03-13-2026, 05:09 PM

BLOG
I am having trouble with date fields in a geopackage. The best format is an ISO string because it can include the timezone, but it can also be an epoch number. SQLite handles all dates through functions, not as a data type. This means that AGOL cannot read a date in a geopackage. How does Survey123 handle dates in a geopackage?
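A minimal sketch of the ISO-string workaround (gdf is assumed to be the GeoDataFrame about to be written to the geopackage, and the column name is a placeholder):

import pandas as pd

# epoch(ms) -> tz-aware datetime -> ISO-8601 string with offset, e.g. "2026-03-12T17:14:00+1300"
gdf["visit_date"] = (
    pd.to_datetime(gdf["visit_date"], unit="ms", utc=True)
      .dt.tz_convert("Pacific/Auckland")          # or stay in UTC
      .dt.strftime("%Y-%m-%dT%H:%M:%S%z")
)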
Posted 03-12-2026, 05:14 PM

POST
YES, but with Pro 3.5.3. I have tried a lot of fixes but to no avail. Copilot blamed the GPU, and all sorts of other things. The easy solution is to rename the folders <user>\AppData\Local\Esri\ArcGISPro and <user>\AppData\Roaming\Esri\ArcGISPro and reboot. You lose your recent list etc., but it's a small price to pay. There are a lot of other things that may be hanging and remain open, which freezes everything including Exit. If you have to use Task Manager, it's Esri's fault, not you or your data. Don't wait for things to close. If anything takes longer than a cup of coffee, interrupt the process and find a fix or a better way.
Posted 03-12-2026, 01:10 AM

POST
You might try my script. It extracts the data from WFS, loads it into a geopackage, and uploads the geopackage, zipped up. Then it turns the online geopackage into a published feature layer collection. Get out the item_ids and put them in your script so the next run updates instead of adds. It is fast and reliable, uses only open-source tools, the arcgis module and an AGOL licence, and it is not a notebook. Now all I need to solve is how to format dates so that AGOL can recognise them as dates.
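Not the script itself, but a rough sketch of the upload/publish/update part under assumed item names, paths and profile:

from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection

gis = GIS(profile="my_agol_profile")      # placeholder profile name

GPKG_ZIP = "traps.gpkg.zip"               # placeholder path to the zipped geopackage
GPKG_ITEM_ID = ""                         # fill in after the first run
LAYER_ITEM_ID = ""                        # fill in after the first run

if not GPKG_ITEM_ID:
    # First run: add the GeoPackage item and publish a hosted feature layer collection
    gpkg_item = gis.content.add({"type": "GeoPackage", "title": "traps"}, data=GPKG_ZIP)
    layer_item = gpkg_item.publish()
    print("Save these item_ids for later runs:", gpkg_item.id, layer_item.id)
else:
    # Later runs: overwrite in place so the item_ids used by the webmap/dashboard are kept
    layer_item = gis.content.get(LAYER_ITEM_ID)
    FeatureLayerCollection.fromitem(layer_item).manager.overwrite(GPKG_ZIP)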
Posted 03-12-2026, 12:58 AM

POST
I never completed the script because Esri fixed the bug to allow adding a WFS layer directly in AGOL. Since then Esri have made a lot of changes to WFS connections. They do not like them because WFS is not the preferred REST interface. For a limited number of records it works well, but over a few thousand there might be a performance hit. You can filter the records in the WFS URL, but that becomes fiddly and specific.

I prefer the 'dynamic' interface that avoids having to run a Python Docker container to refresh a featurelayer at intervals. I can see a lot of problems trying to maintain a multi-user live dataset behind the scenes. The result is a fragile interface where sometimes the update is not successful. I also had to solve importing extra modules into Docker that were not in the default image. This is feeling very hacky! WFS is a batch process, so it is not a live update; the full table is cached somewhere, you only get a snapshot, and the on-demand button does nothing. Since my source is only updated nightly, it did not need a live feed.

The disadvantage of a direct WFS layer is that Arcade is disabled and everything is read-only. This results in a WYSIWYG display that depends on how well the source server structures and presents the data. But you can still do most of the filtering, labelling and symbology needed in a Dashboard, which is my sole purpose for using the WFS layer. I had thought that creating a WFS feature layer and then using it in a Web Map would fool AGOL, but it doesn't: everything is read-only, and not even a virtual field is permitted.

My original workflow used ArcGIS Pro to do the WFS feed and then overwrite the AGOL featurelayer. That menu item has now been disabled in the latest 3.5 update. Copilot suggests this is because keeping the connection open causes problems closing Pro, so it had to go. I can still open a WFS feed and export it to a geodatabase. The obvious next step was to automate this with arcpy and arcgis. That is hard to make work because there is no easy way to export a dataframe to a feature layer. The geodataframe extension does not support WFS, so I have to use pandas, and that is not compatible. It also requires Pro to run. My thoughts turned to using an AGOL notebook. There was a conference technical presentation on how to do it, but I got the impression that the speaker did not recommend it.

I have managed to make a semi-dynamic Dashboard that does not require any update process: the data is entered in a separate app called Trap.nz, which provides a WFS interface with a somewhat non-standard format. Pest Activities. In conclusion, I have given up any customisation in the interests of simplicity. I managed to do most of the filtering and dynamic displays of the activity in three Dashboards. Any deep analysis is best done on the desktop. The result is a completely no-code solution, although it still requires a lot of expertise and a manual to set up.

But here is my script anyway. Notebooks are not a valid file type, so take off the .txt suffix after downloading. A second script I have written does not use arcpy and Pro, only the arcgis module and Python with a geopackage. You have to upload a geopackage and then go into AGOL and create a published feature layer from the online geopackage item. The next time you run the script, fill in the item_ids and do an update so that the item_ids are retained for the webmap and dashboard. All good except: geopackages do not have a date field, and the dates are not recognised by the dashboard. How can this be fixed?
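For reference, a rough sketch of what the WFS extraction step can look like (the endpoint, layer name and GeoJSON output format are placeholders, not the actual Trap.nz request):

import requests
import geopandas as gpd

wfs_url = "https://example.org/wfs"        # placeholder endpoint
params = {
    "service": "WFS",
    "version": "2.0.0",
    "request": "GetFeature",
    "typeNames": "trap_records",           # placeholder layer name
    "outputFormat": "application/json",    # only if the server offers GeoJSON output
}
resp = requests.get(wfs_url, params=params, timeout=120)
resp.raise_for_status()

# Load the GeoJSON features into a GeoDataFrame ready for the geopackage/AGOL steps
gdf = gpd.GeoDataFrame.from_features(resp.json()["features"], crs="EPSG:4326")
print(len(gdf), "features fetched")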
Posted 03-11-2026, 05:58 PM

POST
I have spent a week, off and on, trying to get VSCode to work as it used to. I found this gem from Copilot. Presumably other people know about this; I don't much like the solutions proposed. ArcGIS Pro's integration with VSCode is well-intentioned but a bit heavy-handed. When you launch VSCode from within ArcGIS Pro, it tries to "help" by injecting its environment into your global settings.json, but it sets the path to the env folder without python.exe on the end, which is a folder, not an executable. VSCode then tries to run that as if it were python.exe, and boom: you get the conda.exe error because it is misinterpreting the path.
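A hedged workaround sketch, assuming the injected key is python.defaultInterpreterPath and the default settings.json location; if your settings.json contains comments, json.load will fail and the path is quicker to fix by hand:

import json
import os

settings_path = os.path.join(os.environ["APPDATA"], "Code", "User", "settings.json")
with open(settings_path, encoding="utf-8") as f:
    cfg = json.load(f)

key = "python.defaultInterpreterPath"          # assumed setting name
value = cfg.get(key, "")
if value and os.path.isdir(value):             # a folder was injected, not an executable
    cfg[key] = os.path.join(value, "python.exe")
    with open(settings_path, "w", encoding="utf-8") as f:
        json.dump(cfg, f, indent=4)
    print("Fixed:", cfg[key])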
Posted 08-26-2025, 03:48 PM

POST
Date arithmetic is awful. The best thing is to convert all date fields to ISO strings. Then comparisons and ranges are easy, and if you need to do date arithmetic you can convert the ISO string back to a date object easily. ISO format is designed so that dates sort correctly as strings, and the timezone can be appended (UTC = Z) or the offset added. It is also easy to extract part of the date because the fields are fixed widths. Many date formats do not go before 1970 (the Unix epoch) or 1900 (Microsoft/Excel), so don't rely on them for older dates.
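A small sketch of the idea; note that plain string comparison only matches time order when every value uses the same offset (for example, all UTC):

from datetime import datetime, timezone

d1 = datetime(2025, 6, 25, 6, 13, tzinfo=timezone.utc).isoformat()  # '2025-06-25T06:13:00+00:00'
d2 = datetime(2025, 7, 1, 0, 0, tzinfo=timezone.utc).isoformat()    # '2025-07-01T00:00:00+00:00'

print(d1 < d2)                        # True: string sort order matches chronological order
print(d1[:7])                         # '2025-06', fixed widths make substring extraction easy
back = datetime.fromisoformat(d1)     # convert back to a date object for arithmetic
print(back.year)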
Posted 06-25-2025, 06:13 AM

POST
If you just try to edit the dashboard and replace the source, all settings will be lost, so you have to rebuild the dashboard from scratch.
Posted 06-03-2025, 07:26 AM

POST
In short, you can't! Well, it is not supported in the interactive interface. There is now an unsupported workaround provided by the Esri consulting team using the AGOL Assistant, where you can do a search and replace for the item ID, if you know the item IDs (just do a search for AGOL Assistant and log in again). Maybe you can find the item IDs: the new one is easy enough from the new webmap, and you may still have the old webmap. If you have a new feature layer, then create a new web map that references the new featurelayer; otherwise you will have to do two updates, one for the webmap and another for the featurelayer. I wrote a short Python script to recursively look through the dashboard and find all item IDs it references. I am intending to do the update too, but haven't finished it yet.

# dashboard references 2
from arcgis.gis import GIS
import json
import sys
import os
import arcpy
import pandas as pd

gis = GIS(profile='econet')

# Step 2: Get the Dashboard Item
try:
    dashboard_id = sys.argv[1]
except IndexError:
    # dashboard_id = "0f3670488cfc4ea19f4a8e22252979fd"  # RHB Trapping dashboard
    dashboard_id = "eafa864b127242b29b0f5d08fc20e017"  # PFK Trapping dashboard

dashboard_item = gis.content.get(dashboard_id)
if not dashboard_item:
    arcpy.AddMessage("Dashboard not found. Check the dashboard ID.")
    sys.exit()
arcpy.AddMessage(f"<{dashboard_item.title}> found")

# Step 3: Access the dashboard's configuration JSON (data block)
dashboard_data = dashboard_item.get_data()
if not dashboard_data:
    arcpy.AddMessage("No data block found in the dashboard.")
    sys.exit()

# Step 4: Recursive function to extract item IDs
def extract_item_ids(data, referenced_items):
    if isinstance(data, dict):
        for key, value in data.items():
            if key == "itemId":  # Check for itemId fields
                referenced_items.append(value)
            else:
                extract_item_ids(value, referenced_items)  # Recursively check nested dictionaries
    elif isinstance(data, list):
        for item in data:
            extract_item_ids(item, referenced_items)  # Recursively check nested lists

# Initialise a list to store referenced IDs
referenced_items = []
extract_item_ids(dashboard_data, referenced_items)

# Step 5: Fetch referenced item details
unique_item_ids = set(referenced_items)  # Remove duplicates
item_details = []
for item_id in unique_item_ids:
    item = gis.content.get(item_id)
    if item:
        item_details.append({
            "Item ID": item_id,
            "Title": item.title,
            "Type": item.type
        })

# Step 6: Output results
if item_details:
    arcpy.AddMessage("Referenced Items:")
    for item in item_details:
        arcpy.AddMessage(f"Item ID: {item['Item ID']}, Title: {item['Title']}, Type: {item['Type']}")
else:
    arcpy.AddMessage("No referenced items found in the dashboard.")

# Optional: Save to CSV
df = pd.DataFrame(item_details)
df.to_csv("referenced_items.csv", index=False)
arcpy.AddMessage("Referenced items saved to referenced_items.csv")
arcpy.AddMessage(f"{os.getcwd()}")
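One possible sketch of the unfinished update step, meant to be appended to the script above (it reuses dashboard_item and dashboard_data; the IDs are placeholders, and it should be tested on a copy of the dashboard first):

def replace_item_ids(data, old_id, new_id):
    """Recursively replace old_id with new_id in nested dicts, lists and strings."""
    if isinstance(data, dict):
        return {k: replace_item_ids(v, old_id, new_id) for k, v in data.items()}
    if isinstance(data, list):
        return [replace_item_ids(v, old_id, new_id) for v in data]
    if isinstance(data, str):
        return data.replace(old_id, new_id)
    return data

new_data = replace_item_ids(dashboard_data, "OLD_ITEM_ID", "NEW_ITEM_ID")
# Push the edited JSON back to the dashboard item
dashboard_item.update(item_properties={"text": json.dumps(new_data)})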
Posted 06-03-2025, 07:20 AM

POST
Good idea, but how will it help? I really should look up some logs to see how long it took in ArcMap. Since Esri have turned off my licence for the Pro downgrade I cannot run it now. I really just want normal speed. I have got rid of any BigInteger and BigObjectID fields (well, you have to, or CreateRelation() crashes). April 2024 ArcMap:
--------------------
Build relates 10:34sec (expected time!!)
Rebuild CoraxPro Relates
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Building sap_rel...
Process: Relationship Class... sta-sap rel_parcel_gazette
Building sta_rel...
Process: Relationship Class... sta-ste rel_gazette_act
Building ste_rel...
Process: Create Relationship Class... nmi_rel rel_title_name
licence is ArcInfo
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Process: Relationship Class... sta-sap rel_parcel_gazette
Process: Relationship Class... sta-ste rel_gazette_act
Process: Create Relationship Class... nmi_rel rel_title_name
Python error in nmi "Nominal_Index" does not exist
Well Done 0:13:58.288000
Feb 2025 ArcGISPro same script upgraded for Python 3.....
========================================================
Build relates 10:34sec expected
Rebuild CoraxPro Relates
Process: Table To Relationship Class... lgp rel_parcel_legal
Process: Table To Relationship Class... afp rel_parcel_plan
ERROR: Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of abey_prior_status | cadastral_surv_acc | certified_date | chf_sur_amnd_date | data_source | dataset_id | dataset_series | dataset_suffix | description | dlr_amnd_date | et_created | et_edited | OBJECTID | registered_date | splan | survey_class | survey_date | surveyor_data_ref | type_of_dataset | usr_id_sol.
Failed to execute (TableToRelationshipClass).
Process: Table To Relationship Class... title_rel rel_parcel_title
Process: Table to Relationship Class... sap rel_gazette_detail
Building sap_rel...
Process: Relationship Class... sta-sap rel_parcel_gazette
Building sta_rel...
Process: Relationship Class... sta-ste rel_gazette_act
Building ste_rel...
Process: Create Relationship Class... nmi_rel rel_title_name
Building nmi_rel...
licence is ArcInfo
Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of abey_prior_status | cadastral_surv_acc | certified_date | chf_sur_amnd_date | data_source | dataset_id | dataset_series | dataset_suffix | description | dlr_amnd_date | et_created | et_edited | OBJECTID | registered_date | splan | survey_class | survey_date | surveyor_data_ref | type_of_dataset | usr_id_sol.
Failed to execute (TableToRelationshipClass).
Start Time: Friday, 23 February 2024 7:52:46 pm
Failed to execute. Parameters are not valid.
ERROR 000800: The value is not a member of et_created | et_edited | OBJECTID | original_flag | purpose | share | status | term | timeshare_week_no | ttl_title_no | type.
Failed to execute (TableToRelationshipClass).
Failed at Friday, 23 February 2024 7:52:46 pm (Elapsed Time: 0.06 seconds)
==================================================================
Well Done 2:46:45.2 WHAT? and if the field widths are fixed it's 5 hours!
Note failures are due to BigIntegers that have crept in, I have now realised.
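A quick hedged check before building the relates (the geodatabase path is a placeholder, and recent arcpy versions report the 64-bit type as "BigInteger"; confirm against your install):

import arcpy

arcpy.env.workspace = r"C:\data\corax.gdb"   # placeholder workspace path
for tbl in (arcpy.ListTables() or []) + (arcpy.ListFeatureClasses() or []):
    big = [f.name for f in arcpy.ListFields(tbl) if f.type == "BigInteger"]
    if big:
        # These are the fields that trip TableToRelationshipClass; convert them to Long first
        print(f"{tbl}: convert before relating -> {big}")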
Posted 05-09-2025, 02:48 AM

POST
from arcgis.gis import GIS
from arcgis.features import FeatureLayer
from datetime import datetime, timedelta, timezone

# Connect to ArcGIS Online or Enterprise
gis = GIS("https://www.arcgis.com", "your_username", "your_password")  # Replace with your credentials

# Access your hosted feature layer
feature_layer_url = "https://services.arcgis.com/your_layer_url/FeatureServer/0"
layer = FeatureLayer(feature_layer_url)

# Query features; add a filter so that only empty target fields are updated
features = layer.query().features

# Daylight saving logic for New Zealand (NZDT/NZST)
def get_nz_offset(date):
    month = date.month
    day = date.day
    # Approximate daylight saving: NZDT starts the last Sunday in September, ends the first Sunday in April
    if (month > 9 or (month == 9 and day >= 25)) or (month < 4 or (month == 4 and day <= 7)):
        return timedelta(hours=13)  # NZDT (UTC+13)
    else:
        return timedelta(hours=12)  # NZST (UTC+12)

# Update each feature with the local time conversion
updates = []
for feature in features:
    # The stored epoch milliseconds are UTC, so interpret them as UTC
    utc_time = datetime.fromtimestamp(feature.attributes["timestamp_field"] / 1000, tz=timezone.utc)
    local_time = utc_time + get_nz_offset(utc_time)
    feature.attributes["local_timestamp_field"] = int(local_time.timestamp() * 1000)  # Back to milliseconds
    updates.append(feature)

# Apply updates to the feature layer
layer.edit_features(updates=updates)
# Instead of writing back into another date field, you could store the local time as ISO strings
print("Time zone conversion completed!")
Posted 05-09-2025, 02:43 AM

POST
The replace-layer option in the ArcGIS Pro share dialog needs exclusive access; it crashes if I also have the layer open in AGOL. That might also be a problem if the layer is being used in Survey123 or Field Maps. It is also not dynamic: new records will need to be processed. I thought of a virtual field, but that might not be possible.
Posted 05-08-2025, 02:18 PM