POST
A hosted feature layer of half a million records? I assume it is on your own server, not ArcGIS Online? The storage charges are per 10 MB, not per gigabyte! Have you tried virtual field(s) in the WebMap? Then you can use Arcade to generate the new fields dynamically.
05-08-2025 02:13 PM | 0 | 1 | 2252
POST
I have just upgraded to 3.4.0, which is no better. It has been a problem ever since 3.3.1. Maybe it is due to the BigInteger introduction?
05-08-2025 02:07 PM | 1 | 2 | 1029
POST
Why are you using shapefiles and Excel tables? You would be much better off using a proper database format with tables. Then you can validate the data and control the data schema. It will be faster and more robust, and it will handle the difference between blanks and null values. You can use the Microsoft schema.ini file to properly define the schema for the Excel spreadsheet before you import it to a database table. Once in a database, the keys you are going to use for the table join can be indexed for speed and set to NOT NULL to make sure they are always populated, and the schema is generally documented. The codes with dashes have to be strings, but that may need to be set explicitly.
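For reference, a schema.ini file describes delimited text files (so it assumes the sheet has first been saved as CSV), and it sits in the same folder as the file. A minimal sketch — the file name and column names here are hypothetical:

```ini
; schema.ini lives next to the file it describes
[parcels.csv]
Format=CSVDelimited
ColNameHeader=True
; force the dash-coded key to text so codes like "12-34" survive import
Col1=PARCEL_CODE Text Width 20
Col2=AREA_HA Double
Col3=OWNER_NAME Text Width 80
```

One section per file; the Col*n* entries override whatever types the driver would otherwise guess.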
05-07-2025 05:38 PM | 0 | 0 | 1112
POST
It anything takes longer than a cup of coffee, interrupt the process and find a better way! It is likely that it will crash with no results if it has to run for days. To update an online table from ArcGISPro it only works for a few records. If you need a bulk update the best way is to export the table to a local filegeodatabase, do the process on a local machine and then replace the online layer. The replace layer function actually does a Staging Update. This zips up the file and metadata for the layers, uploads and then unpacks in the cloud. This all takes a few seconds or at the most minutes.
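The zip-up step of that workflow can be scripted in plain Python; a minimal sketch (the function name zip_fgdb is mine, and the commented-out upload call is only an illustration of where the arcgis Python API would take over):

```python
import pathlib
import shutil

def zip_fgdb(fgdb_path, out_dir):
    """Zip a file geodatabase folder (.gdb) so it can be uploaded
    to replace the online layer in one staged update.
    Returns the path of the created .zip archive."""
    fgdb = pathlib.Path(fgdb_path)
    # make_archive wants the archive name *without* the .zip extension
    target = pathlib.Path(out_dir) / fgdb.stem
    return shutil.make_archive(str(target), "zip",
                               root_dir=str(fgdb.parent), base_dir=fgdb.name)

# The upload/replace itself would use the arcgis Python API, e.g. (not run here):
# from arcgis.features import FeatureLayerCollection
# FeatureLayerCollection.fromitem(item).manager.overwrite(zip_path)
```

The whole .gdb folder must go into the archive (it is a directory of internal files, not a single file), which is why base_dir is the folder name.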
05-07-2025 05:30 PM | 1 | 2 | 2277
POST
Use FME. That will work forever; they never deprecate earlier readers. It is also called Data Interop for the Esri add-on, if you want to pretend that it is an Esri extension.
05-07-2025 05:21 PM | 2 | 0 | 1978
POST
I have a Python script that built database relates in ArcMap in a file geodatabase. They are many-to-many relates between three tables, using a relationship table in the middle. This is the design from the source in PostGIS that I want to replicate. The keys between the tables are indexed integer fields. This all worked well in ArcMap, and building the relates took a few minutes. Now, in ArcGIS Pro, the same script takes many hours! What has happened? [As a side issue, BigInteger keys cause a crash; they must be 32-bit Integers.] The tables are medium sized, typically 2M records. Note that you cannot rename any tables afterwards; you have to rebuild from scratch. All tables must be in the same file geodatabase. The relation table built is a semi-hidden table, hard to delete in a script. Has anyone else noticed this? Do you have any suggestions to restore the speed? Maybe remove the indexes before building the relates and re-create them afterwards?
05-07-2025 05:13 PM | 1 | 4 | 1100
POST
Have you considered the GroupBy function in Arcade? It seems to be very useful for grouping multiple names and getting a count.
04-27-2025 03:49 PM | 0 | 0 | 1708
POST
I made a mistake and deleted a feature layer used in a dashboard. I found out in time (two weeks), so I was able to restore it. That got me thinking: how could I back up (and restore!) my precious work? There are a few problems:
1. You can export the JSON file defining the dashboard, but that is not enough. You also need the items referenced by the dashboard. That will mostly be a WebMap. But wait... the WebMap references FeatureLayers and Tables. So there needs to be a way of making a list of dependencies that must also be backed up.
2. All the references in AGOL use item_id GUID codes that are read-only. This means that simply recreating a lost reference will not work unless it has the SAME id.
3. Trying to repair a dashboard with the replacement item is impossible in the interactive interface. As soon as you change the source, ALL settings are removed. You may as well be starting again.
So there are no simple tools to package up a project into a zip file and store it for restoration later. This means that the work is very fragile. No wonder there is a flag to avoid accidental deletion (which I overrode!). But I am a Python programmer. Surely there is a simple arcgis function or module to make this routine? Nothing that I could find. The closest is the ArcGIS Assistant, which can do a bit of a hack on the JSON defining the items. So I attempted to write my own. This is harder than it seems, even with the useless help of Copilot; I had to keep correcting its obvious lack of understanding of the JSON structure. The first step is to get a list of item_ids referenced in the dashboard. This is hard: you have to do a recursive search or you won't get anything, even though you know they are there somewhere. Once you get the WebMap ids you can save their JSON and move on to the FeatureLayers. The FeatureLayers return a binary ServiceDefinition, which is a rabbit hole. I just want the item_id. Then there is the restore process. We will need a list of all these separate file dumps to return to AGOL, together with the groups, owners and permissions. I can see why there is a market for third-party products. So this is an unsolved workflow for me. Is there something I have missed? How do other people back up/restore projects on AGOL? Do you just put it in the too-hard basket?
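The recursive search over the dashboard JSON can be sketched in pure Python. This is my own sketch, not an arcgis API call: the key names ("itemId" etc.) and the 32-hex-character id pattern are assumptions about how AGOL item JSON typically looks.

```python
import re

# AGOL item ids appear to be 32 lowercase hex characters
ITEM_ID = re.compile(r"^[0-9a-f]{32}$")

def find_item_ids(node, found=None):
    """Recursively walk a dashboard/webmap JSON structure (nested dicts
    and lists) and collect every value that looks like an AGOL item id
    stored under an itemId-style key."""
    if found is None:
        found = set()
    if isinstance(node, dict):
        for key, value in node.items():
            if key in ("itemId", "itemID", "id") and isinstance(value, str) \
                    and ITEM_ID.match(value):
                found.add(value)
            find_item_ids(value, found)   # recurse into the value too
    elif isinstance(node, list):
        for value in node:
            find_item_ids(value, found)
    return found
```

Given the JSON from item.get_data(), the returned set would be the dependency list to fetch and dump next.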
04-04-2025 03:16 PM | 4 | 0 | 1166
IDEA
But there is another problem not covered by my workaround. OBJECTID can be 32 or 64 bit as well. This is much harder to change, and even to detect: the type is "OID" for both (not helpful), but you can find out that the length is 4 or 8. If the table is empty you can edit the field schema by hand to fix it, but I don't know how to do it in a script.
04-04-2025 02:34 PM | 0 | 0 | 662
IDEA
You may need to work around a bug(?) that adds a blank field to the end of the list returned by the arcpy.ListFields(fc) command:

fields = arcpy.ListFields(input_fc)
print(f"L42 {[f.name for f in fields]}")
for field in fields:
    if field.type not in ('OID', 'Geometry'):  # not allowed in fieldmappings
        if field.name:  # very strange blank name added to the list by Esri
            field_map = arcpy.FieldMap()  # new for each field
            print(f"L46 {input_fc},<{field.name}>")
12-03-2024 03:35 PM | 0 | 0 | 820
POST
We handle this by having a point layer of labels that relate back to the polygons. There is only one point for each property, but some properties are multipart. That doesn't matter, because they still have a total area. You only get one label, so it is placed in the largest part. You add labels once the polygons have been exploded and then choose the largest one: run a summary by max area, select the largest and discard the others. If you want to get rid of the non-boundaries there is an aggregate tool.
12-03-2024 04:05 AM | 1 | 0 | 1511
POST
There are two obvious tools to create a set of index sheets based on the centroids. Do you have a sheet size in mind? Thanks for the sample; that makes it much easier to experiment. The first thing I did was to project the data into a file geodatabase so that I have feet units. My first attempt was to just create Thiessen polygons around the points. This works well for even spacing, though there are a few long bits around the edges. You could trim them up and straighten the fuzzy bits by snapping to the larger index. I then tried a graphic buffer with a size of 290 metres. This also gave a good start; a bit of integration would snap the overlaps to a tidy grid.
12-03-2024 03:46 AM | 0 | 0 | 663
IDEA
Even if the policy is correctly set for casting the ambiguous Integer type, I still need a solution that works for me. I can see that computing has evolved from 16- to 32- to 64-bit integers, with some defined as signed and some as unsigned. I have seen LONG, MEDINT, INTEGER and SHORT, as well as the new BigInteger that tries to extend the meaning of 'LONG', which in 32-bit days meant only 32 bits. To keep backward compatibility, the temptation is to reuse 'Long' for 64 bit when using a 64-bit compiler. I was interested to see that MEDINT was interpreted as Long and INTEGER as BigInteger going from a gpkg to a file geodatabase. So redefining the gpkg field type might be a good workaround that avoids having to explicitly cast the field type using a fieldmappings parameter with ExportFeatures_conversion() replacing CopyFeatures(). [Note that sqlite does not care what you call it in the DDL; they are always 64-bit integers internally.]

The small print in the help also explains why the option setting in Options in ArcGIS Pro does not work for me: when running a script outside ArcGIS Pro on the command line, you have to set the option in the arcpy environment (arcpy.env). But Double is useless for me; I want to use the field as an indexed key, so it has to be some sort of integer. Quoting the help:

    useCompatibleFieldTypes (Read and Write) — Specifies whether field types that are compatible with ArcGIS Pro 3.1 will be used. This setting relates to the use of the Date Only, Time Only, Timestamp Offset, and Big Integer field types in CSV tables and unregistered database tables with tools that create layers or table views. When set to True, a layer or table view created from these sources will use field types compatible with ArcGIS Pro 3.1. Date Only, Time Only, and Timestamp Offset fields will be displayed as Date fields, and Big Integer fields will be displayed as Double fields. When set to False, all original data source field types will be used. Note: This property is applicable when used from stand-alone Python or for a geoprocessing service. When used in ArcGIS Pro, this property will always match the "Use field types that are compatible with ArcGIS Pro 3.1 and earlier releases when adding query layers and text files" option.

A better option, which I have implemented, is to replace CopyFeatures_management() with my own function CopyFeaturesLong() that looks for any BigInteger arcpy interpretations and keeps them as a compatible Integer (Long), not a Double. Here it is if you want to use it, together with the matching CopyRowsLong() replacement for CopyRows().

import arcpy

debug = False  # set True to print the output schema after each export

def CopyFeaturesLong(input_fc, output_fc):
    """CopyFeatures replacement that ensures every BigInteger is mapped
    to Long, via a fieldmappings parameter and arcpy.conversion.ExportFeatures().
    With a bit of help from Copilot! Full paths to feature classes assumed."""
    field_mappings = arcpy.FieldMappings()
    # it would be nice if objectid and shape could be excluded easily
    for field in arcpy.ListFields(input_fc):
        if field.type not in ('OID', 'Geometry'):  # not allowed in fieldmappings
            field_map = arcpy.FieldMap()  # new for each field
            field_map.addInputField(input_fc, field.name)
            # if the field type is BigInteger, change it to Long
            out_field = field_map.outputField
            if out_field.type == 'BigInteger':
                out_field.type = 'Long'
                field_map.outputField = out_field
            field_mappings.addFieldMap(field_map)
    # apply the field mappings and create the output feature class
    arcpy.conversion.ExportFeatures(input_fc, output_fc, field_mapping=field_mappings)
    if debug:
        for fld in arcpy.ListFields(output_fc):
            print(fld.name, fld.type)
        print(f"{output_fc} feature class exported with updated field types!")
    return True

def CopyRowsLong(input_tab, output_tab):
    """CopyRows replacement that ensures every BigInteger is mapped
    to Long, via a fieldmappings parameter and arcpy.conversion.ExportTable().
    With a bit of help from Copilot! Full paths to tables assumed."""
    field_mappings = arcpy.FieldMappings()
    for field in arcpy.ListFields(input_tab):
        if field.type not in ('OID', 'Geometry'):  # not allowed in fieldmappings
            field_map = arcpy.FieldMap()  # new for each field
            field_map.addInputField(input_tab, field.name)
            # if the field type is BigInteger, change it to Long
            out_field = field_map.outputField
            if out_field.type == 'BigInteger':
                out_field.type = 'Long'
                field_map.outputField = out_field
            field_mappings.addFieldMap(field_map)
    # apply the field mappings and create the output table
    arcpy.conversion.ExportTable(input_tab, output_tab, field_mapping=field_mappings)
    if debug:
        for fld in arcpy.ListFields(output_tab):
            print(fld.name, fld.type)
        print(f"{output_tab} table exported with updated field types!")
    return True
11-02-2024 03:45 PM | 0 | 0 | 6046
POST
Correction: you cannot hide fields in Pro; they all have to be visible.
09-28-2024 08:04 PM | 0 | 0 | 3114
POST
# run in a notebook cell; the quotes protect a prefix path containing spaces
import sys
prefix = '\"%s\"' % sys.prefix
!conda install --yes --prefix {prefix} geopandas
09-19-2024 04:10 AM | 2 | 0 | 1474