POST
Actually, you cannot use numeric codes for text fields in ArcGIS Pro domains any more; numeric field types only allow ranges. It half works in ArcMap, but the text is always right-justified and the column widths go wonky. So even if you want to use digits, say for sorting, they have to be character strings. I still hate this because our postal service uses leading zeros: my postcode is "0626", but if Excel gets hold of that it morphs into 626, and so on. Sometimes export tools let you choose between the code and the description, but since I am a programmer the codes will do just fine, because you have to use the code in SQL expressions anyway.
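The leading-zero point can be shown in two lines of plain Python (the other postcode values here are made up for illustration):

```python
# codes stored as text keep their leading zeros and still sort correctly
postcodes = ["0626", "1010", "0612"]
print(sorted(postcodes))  # ['0612', '0626', '1010']

# if Excel (or a numeric field) has already stripped the zeros, pad them back
mangled = 626
restored = str(mangled).zfill(4)
print(restored)  # '0626'
```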
Posted 02-11-2024 02:35 AM

POST
The basic issue is that in the interactive window the layers are recognised as objects and the underlying source is found. In a script the layers are not visible, because you are not inside ArcGIS, so you have to use the path to the feature class. If you still want to use layer names, you could save the layer to a .lyrx file and read that in as a layer object, which has a property for finding the feature class path. In a script tool you could add a parameter that asks for a 'layer name', but then you would have to guess that the feature class is named the same as the layer. If you ask for a layer object instead, you will get an object that knows its source feature class.

Tip: when debugging, don't use try/except error trapping. Just let the script crash and you will get better feedback on where and why it failed. If you do add the trap, then print what the error was.

In a standalone script, to make a layer you have to: 1. Define the feature class and path. 2. my_layer = arcpy.management.MakeFeatureLayer(...). You can apply a filter, select by location and all the other fancy things a layer makes possible, without making a copy of the feature class. Then the layer name can be used as in your interactive script.

Print statements only show when you run the script from an editor. That is a good thing to do anyway, because you will trap the syntax errors earlier and you can add breakpoints and print intermediate results. I use arcpy.management.GetCount() a lot to stop if there are no records selected, and arcpy.Exists(fc) to check that I have got the data.
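As an aside, a .lyrx file is just JSON (Esri's CIM), so you can even dig the source path out without arcpy. This is only a sketch, and it assumes a simple feature layer whose dataConnection carries a workspaceConnectionString and a dataset name; other layer types store their source differently, and the sample fragment below is fabricated:

```python
import json

def fc_path_from_lyrx(lyrx_text):
    """Pull the source feature class path out of a .lyrx file's JSON.
    Assumes a plain feature layer; the CIM structure varies by layer type."""
    cim = json.loads(lyrx_text)
    conn = cim["layerDefinitions"][0]["featureTable"]["dataConnection"]
    # workspaceConnectionString looks like 'DATABASE=C:/data/demo.gdb'
    workspace = conn["workspaceConnectionString"].split("=", 1)[1]
    return "{}/{}".format(workspace, conn["dataset"])

# minimal made-up lyrx fragment, just enough to exercise the function
sample = json.dumps({
    "layerDefinitions": [{
        "featureTable": {
            "dataConnection": {
                "workspaceConnectionString": "DATABASE=C:/data/demo.gdb",
                "dataset": "Roads"
            }
        }
    }]
})
print(fc_path_from_lyrx(sample))  # C:/data/demo.gdb/Roads
```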
Posted 02-11-2024 02:21 AM

POST
No, it cannot be done; it is not supported. Anyway, it only works for a demo, not for large datasets. I find I have to do my own 'join' using dictionaries. This turns out to be faster, more reliable and more scalable. One benefit is that the data does not have to be in the same database, because you are using Python structures as an intermediary. You do get into the weeds of cursors, dictionaries and error trapping a bit. Actually, with Pandas now installed as a default package, you may be able to streamline a join. Must look into that. My legacy workflow is like this:

1. Describe the two feature classes or tables and get the schema for the table you wish to join, plus the keys. Or use arcpy.ListFields().
2. Choose the foreign key and the user fields you wish to join (make a list of names).
3. Add the new fields to the target feature class, or perhaps to a copy that you want to create.
4. Use a SearchCursor to create a dictionary of the table data, with the foreign key as the dictionary key and the attributes as a tuple. You will not be needing the shape field, OBJECTID, or dynamic fields such as area and perimeter.
5. Open an UpdateCursor on your target feature class with a list of the fields in the tuple. Iterate through the table and update the new empty fields using the dictionary and key. Best to use dict.get(key, None) to avoid missing keys.

It all sounds like a bit of work, but once you have the pattern it is easy to adapt to the next project. It is really, really FAST. If you have billions of records you can always do some partitioning, but you should be OK with millions of records. There is an example of a table join in my Python Tips talk.
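The dictionary part of steps 4 and 5 can be sketched with plain Python structures (the field names and values here are invented; in a real script the two lists would come from arcpy.da.SearchCursor and the update loop would be an arcpy.da.UpdateCursor):

```python
# rows standing in for SearchCursor output: (foreign_key, attr1, attr2)
lookup_rows = [
    ("A1", "oak", 12.5),
    ("B2", "pine", 8.0),
]
# step 4: dictionary keyed on the foreign key, attributes as a tuple
join_dict = {row[0]: row[1:] for row in lookup_rows}

# step 5: iterate the target rows (plain lists standing in for an
# UpdateCursor) and fill the new empty fields from the dictionary
target_rows = [
    ["A1", None, None],
    ["C3", None, None],   # no match: stays None thanks to dict.get
]
for row in target_rows:
    match = join_dict.get(row[0], (None, None))
    row[1], row[2] = match
print(target_rows)  # [['A1', 'oak', 12.5], ['C3', None, None]]
```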
Posted 02-11-2024 02:02 AM

POST
Here is an example of finding the latest visit record (using Pandas) and then transferring that to the parent location table.

# current_status.py
# from Visits_Table put back status, edit date, difficulty on WeedLocations
# use Pandas for ease, speed, simplicity
# 19 Sept 2022 latest schema, different gdb
# 12 Oct 2022 change to DateCheck from EditDate but keep EditDate as last record
import sys
import collections
from datetime import datetime

import arcpy
import pandas as pd

try:
    gdb = sys.argv[1]
except IndexError:
    disk = sys.argv[0][0]  # drive letter of the script's own path
    gdb = '{}:/project/econet/source/cams_weed.gdb'.format(disk)

start = datetime.now()
if not arcpy.Exists(gdb):
    raise IOError('Cannot find {}'.format(gdb))
arcpy.env.workspace = gdb
arcpy.env.overwriteOutput = True
arcpy.AddMessage(gdb)
debug = True

# two tables in the gdb
weeds = 'WeedLocations'
visits = 'Visits_Table'

# basic attributes to be transferred from Visits to WeedLocations, not validated yet
visit_to_weed = {
    'Guid_visits': 'GlobalID',                              # 0 foreign key -> primary key
    'DateCheck': 'DateVisitMadeFromLastVisit',              # 1 for latest date; note not the same as EditDate
    'WeedVisitStatus': 'StatusFromLastVisit',               # 2 as inspected
    'DifficultyChild': 'DifficultyFromLastVisit',           # 3 as inspected
    'VisitStage': 'LatestVisitStage',                       # 4 as inspected
    'Area': 'LatestArea',                                   # 5 as inspected
    'DateForReturnVisit': 'DateForNextVisitFromLastVisit',  # 6 calculated
    'EditDate': 'EditDate'                                  # dummy for pandas to find the latest record
}
in_flds = list(visit_to_weed.keys())
out_flds = list(visit_to_weed.values())

where = None  # e.g. """EditDate > date '2022-07-01'"""
vdate = [row for row in arcpy.da.SearchCursor(visits, in_flds, where)]
print('vdate:', len(vdate))

# put the rows in a pandas dataframe and process, woohoo
df = pd.DataFrame(vdate, columns=in_flds)
# find the record with the max edit date per visit and keep the other details, all in one line!
idx = df.groupby(['Guid_visits'])['EditDate'].transform('max') == df['EditDate']
# dict of guid -> [DateCheck, WeedVisitStatus, DifficultyChild,
#                  VisitStage, Area, DateForReturnVisit, EditDate]
dVisit = df[idx].set_index('Guid_visits').T.to_dict('list')

# count visits for each location
vguid = [row[0] for row in arcpy.da.SearchCursor(
    visits, ['Guid_visits'], 'Guid_visits IS NOT NULL')]
# dict of counts by GlobalID for updating
vguid_counts = collections.Counter(vguid)

# update weeds with the visit count and latest details
# fields: 0 VisitCount, 1 GlobalID, 2..8 the renamed visit attributes
with arcpy.da.UpdateCursor(weeds, ['VisitCount'] + out_flds) as cur:
    n = 0
    for row in cur:
        try:
            key = row[1]  # GlobalID
            row[0] = vguid_counts[key]  # Counter returns 0 for missing keys
            latest = dVisit.get(key, None)
            if latest:
                row[2], row[3], row[4], row[5], row[6] = latest[:5]
            cur.updateRow(row)
            n += 1
        except Exception as e:
            arcpy.AddMessage(row)
            arcpy.AddMessage(e)
print("Well Done, {} records updated in {}".format(n, datetime.now() - start))
Posted 03-17-2023 02:46 PM

POST
I am afraid that relationship classes are not supported in arcpy. You will have to create your own equivalent, which as it turns out is much better and faster anyway:

1. Read in the related table using a SearchCursor inside a list comprehension to get an in-memory list of records, with the key as one of the fields.
1a. Convert the list to a Pandas dataframe.
2. Use Pandas to find the minimum or maximum values with a groupby on the foreign key.
3. Make a dictionary of the resulting statistics.
4. Run an UpdateCursor on your feature class and use the dictionary of min/max values to update it.

This will be lightning fast, reliable and easy to understand. You might add some error trapping, such as using .get(key, None) instead of a plain lookup to avoid missing values. See the example.
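Steps 1a to 3 can be sketched like this (the field names and values are invented; step 1 would really be a SearchCursor list comprehension and step 4 an UpdateCursor loop):

```python
import pandas as pd

# fabricated related-table records: (foreign_key, value)
related = pd.DataFrame(
    [("A1", 5.0), ("A1", 9.0), ("B2", 3.0)],
    columns=["fkey", "value"],
)
# steps 2-3: max value per foreign key, as a plain dict for fast lookups
max_by_key = related.groupby("fkey")["value"].max().to_dict()
print(max_by_key)  # {'A1': 9.0, 'B2': 3.0}

# step 4 (sketch): inside an UpdateCursor loop you would then do
#   row[1] = max_by_key.get(row[0], None)
```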
Posted 03-15-2023 06:41 PM

POST
Maybe start with a temporary layer of start nodes and end nodes from the pipelines? There is a tool to do this. Then overlays of the node points would give you the ends. Keep track of the ObjectIDs to get references back to the original lines.
Posted 03-12-2023 03:07 PM

POST
ModelBuilder is straining to support iterators. It is much better to start using Python for iteration. You are still using the same functions, but the looping is far easier to understand, you get much better control over missing data and errors, and you can use cursors, which are much faster. Start your upgrade by taking each function and right-clicking it to copy a Python snippet, then put them all in a script. Alternatively, encapsulate just the iterator in a single script, publish it as a custom tool, and leave the rest of the process in ModelBuilder.
Posted 03-11-2023 03:54 PM

POST
To operate tools on data stored on disk, you always have to create in-memory definitions or views first. MakeTableView is the simpler of the two: it works on one table and is more reliable. It can make a subset of records and a subset of fields, though sometimes other tools get confused by the result, e.g. Intersect overlays. MakeQueryTable attempts to set up a relational database expression across several tables in the same database. It may work for sample sets, but it never works for me on real-sized datasets; often it just hangs or crashes after hours. After trying it for a while, I avoid it at all costs.

So what to do? If you are using ModelBuilder, or are trying to do interactive selections, you are stuck with it. Simplify and reduce first by exporting to a temporary feature class, or partition the data into smaller sets and repeat.

If you are using Python there are much better options. You can copy the data straight into NumPy or Pandas and run Python functions on it. Note that SQL expressions have to stay within one database. Break that limitation by extracting the whole dataset with a SearchCursor, then using Python set and dictionary operations to do the join, which means you are no longer limited to one database. Python dictionaries are very fast lookups and can be very large, say a million records. I often use a list comprehension around a SearchCursor to extract the table, then loop through another table in another database with the dictionary. Alternatively, I build a huge list into an SQL IN clause to do the selection; make sure the field is indexed in the database and you will have your selection in milliseconds, compared to hours with MakeQueryTable.
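Building that IN clause is worth chunking, since very long WHERE clauses can upset some databases. A small sketch (the field name and chunk size are only illustrative; quote the values if the field is text):

```python
def in_clauses(field, values, chunk=1000):
    """Yield SQL 'field IN (...)' clauses in chunks so no single
    clause grows too long for the database to handle."""
    for i in range(0, len(values), chunk):
        block = values[i:i + chunk]
        yield "{} IN ({})".format(field, ",".join(str(v) for v in block))

clauses = list(in_clauses("OBJECTID", [1, 2, 3, 4, 5], chunk=3))
print(clauses)  # ['OBJECTID IN (1,2,3)', 'OBJECTID IN (4,5)']
```

Each clause can then be passed in turn as the where_clause of a SearchCursor or a selection tool.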
Posted 03-11-2023 03:47 PM

POST
Thanks, have been laid low with an Achilles tendon rupture, covid and cancer in the family. Back soon.
Posted 03-11-2023 02:33 PM

POST
This is tricky even for polygons. Esri enforces a convention of outer rings clockwise with holes counterclockwise. This is the right-hand rule and defines which side of a polygon is up. But OGC forgot to define this, so lots of open tools do not enforce the rule, causing spatial analysis errors. To standardise, run the Repair Geometry tool. Note that all KML polygons will be corrupt after importing, for example.

But what about polylines? There is no convention here, though it often matters, such as when labelling text (now more sophisticated, thank goodness). There is no from-to convention for polylines, and even finding the start and end nodes is no longer supported with shapefiles. It was for coverages, with their built-in arc-node topology.

Someone could write a small Python script to find the start point, end point and midpoint (all available as properties), then do some vector geometry, calculating the cross product to see whether the polyline curves to the left or the right, and flip the polylines that need it. Another approach might be to close the start and end points temporarily to create polygons, run Repair Geometry, find the ones that have been reversed (repaired), and those are the source polylines that need flipping.

You might also be able to use linear referencing. When you define routes, you nominate a corner of the extent for the start of each route, which makes all the routes run in the same direction: not quite clockwise, but directed. And when calculating the convex hull of a cloud of points, the boundary is a clockwise polyline; maybe you can adapt that very simple algorithm to achieve a similar result, or make a convex hull and then order your data.

Now that I have thought about it, I don't really understand your question! A picture would help. Do you want all your polylines flowing like a stream, or is each one independent? I am very curious as to why you need them flipped.
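The cross-product test mentioned above is only a few lines. This sketch works on plain (x, y) tuples, which you could pull from a geometry's firstPoint, midpoint and lastPoint:

```python
def turn_direction(start, mid, end):
    """2D cross product of (mid - start) x (end - mid).
    Positive = curves left (counterclockwise), negative = curves right,
    zero = the three points are collinear."""
    v1 = (mid[0] - start[0], mid[1] - start[1])
    v2 = (end[0] - mid[0], end[1] - mid[1])
    return v1[0] * v2[1] - v1[1] * v2[0]

print(turn_direction((0, 0), (1, 0), (1, 1)))   # 1 -> curves left
print(turn_direction((0, 0), (1, 0), (1, -1)))  # -1 -> curves right
```

Polylines whose sign comes out the wrong way are the ones to flip.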
Posted 03-11-2023 02:19 PM

IDEA
I have been starting to use the ArcGIS API for Python and I have found it very difficult to understand the functions. The samples are just 'showoff' examples of why you would use GIS, not of how to debug the tools. I find the arcpy documentation much more helpful, because there are example snippets on every function right in the help, not buried somewhere in a sample. The edit tools have 16 samples for the whole module; most of the functions and parameters are not covered by an example anywhere. I cannot find a working example of extract_changes(). I note on the community noticeboard that there was a bug in 2020 where 'optional' parameters for the URI were actually required. Has this changed? Let's see something working so I can backtrack to see why my environment is different. I thought the Python Open API on GitHub would have the source for arcgis! Some hope: it only has a smattering of examples that give a once-over-lightly of each module. My initial thought was that since there was a missing function where I was porting an arcpy script, I would add one myself; open source, published... (I can see a child table separately from the parent table in the GUI, so why not in the API?)
Posted 10-04-2022 10:13 PM

POST
There is already a function built into the API to validate the SQL string. Very straightforward: https://developers.arcgis.com/python/api-reference/arcgis.features.html#arcgis.features.FeatureLayer.validate_sql
Posted 10-04-2022 04:01 PM

POST
I am searching for a working example of extract_changes() too; I cannot understand the error messages. I did see a post from 2020 saying it does not accept the defaults as documented; you have to add the URI, for example.
Posted 10-04-2022 03:49 PM

POST
What you have to do is import the arcgis module as well as arcpy. Then you can access the AGOL tools, which include the geocoding service. Or switch to the geocoding service directly from ArcGIS Pro. The functions and parameters are different, but ultimately it is the same service, and it will also cost credits.
Posted 09-13-2022 10:22 PM
Title | Kudos | Posted
---|---|---
| 2 | 02-22-2024 01:25 AM
| 1 | 02-22-2024 01:57 AM
| 1 | 03-15-2023 06:41 PM
| 2 | 03-11-2023 03:47 PM
| 1 | 09-04-2022 03:45 AM