POST
The number of records determines which method (GetCount or da.SearchCursor) is faster. If the number of records is quite large, GetCount is faster; otherwise the SearchCursor one-liner is faster. I use the latter method (cursor) in some code that recursively (and quickly) needs to determine selected-set counts. Jason - Curious why you indicate the cursor method is a poor choice (locks? memory? ???).

import arcpy, time

fc = r"C:\csny490\overlay_20130620\ldo_20130620\ldo_database.gdb\crew_code"

time1 = time.clock()
result = int(arcpy.GetCount_management(fc).getOutput(0))
time2 = time.clock()
print "GetCount (" + str(result) + "), took " + str(time2 - time1) + " seconds..."

time1 = time.clock()
result = len([r[0] for r in arcpy.da.SearchCursor(fc, ["OID@"])])
time2 = time.clock()
print "SearchCursor (" + str(result) + "), took " + str(time2 - time1) + " seconds..."

Some results:

Table with 8 records: GetCount (8), took 0.755854384327 seconds... SearchCursor (8), took 0.136624540949 seconds...
Table with 81k records: GetCount (81327), took 0.996590826688 seconds... SearchCursor (81327), took 0.648470391747 seconds...
Table with 2 mm records: GetCount (2133404), took 4.37459715564 seconds... SearchCursor (2133404), took 11.5886194669 seconds...
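For readers without an ArcGIS license, the same "time two counting strategies" pattern can be sketched in pure Python using an in-memory sqlite3 table as a stand-in for the geodatabase (sqlite3 and the table below are my stand-ins, not part of the original post):

```python
import sqlite3
import time

# Build an in-memory table with 81327 rows, mirroring the 81k-record test
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE crew_code (oid INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO crew_code (oid) VALUES (?)",
                 ((i,) for i in range(1, 81328)))

# Strategy 1: ask the engine for the count (analogous to GetCount)
t1 = time.time()
count_engine = conn.execute("SELECT COUNT(*) FROM crew_code").fetchone()[0]
t2 = time.time()
print("COUNT(*) (%d), took %f seconds..." % (count_engine, t2 - t1))

# Strategy 2: materialize the ids and take len() of the list
# (analogous to the SearchCursor one-liner)
t1 = time.time()
count_list = len([row[0] for row in conn.execute("SELECT oid FROM crew_code")])
t2 = time.time()
print("len(list) (%d), took %f seconds..." % (count_list, t2 - t1))
```

As in the arcpy results above, the engine-side count avoids shuttling every row into Python, which is why it wins as tables get large.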
03-25-2014 11:58 AM | 3 | 0 | 16528

POST
If you aren't already, you may find much better performance in copying the Oracle coordinate table locally, creating the polygon geometry/feature class locally, and then appending the new features to your SDE-based feature class. What you are doing seems legit, but reading/writing across the network can be quite slow. Usually I copy everything local first, create the product/data, and then post to the network.
03-25-2014 11:31 AM | 0 | 0 | 504

POST
Sadly I did not - but it would be pretty easy to include using the code example above. In fact it could be easier, since all you would really need to calculate is the slope of each segment (and then a mean, or max, or whatever). It'd be even cooler to "length weight" the mean angle/slope values too... You know - an even cooler researchy thing... Go drive around in a car with a GPS (tracking on) and build a data-driven model based on actual observations (what exactly is the correlation between geometry complexity and speed?). Logging roads would be way more fun than the city... Maybe get a turbo WRX or something to help with the field recon :cool:.
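The segment-slope and "length weight" idea can be sketched in plain Python without arcpy. This is a hypothetical helper (the function name and the sample vertices are mine, not from the thread): given (x, y, z) vertices of a line, compute each segment's rise-over-run slope and average them weighted by horizontal segment length.

```python
import math

def length_weighted_mean_slope(vertices):
    """Hypothetical sketch: mean segment slope along a 3D polyline,
    weighted by each segment's horizontal length."""
    total_weight = 0.0
    weighted_sum = 0.0
    for (x1, y1, z1), (x2, y2, z2) in zip(vertices, vertices[1:]):
        run = math.hypot(x2 - x1, y2 - y1)  # horizontal length of the segment
        if run == 0:
            continue  # skip duplicate/stacked vertices
        slope = abs(z2 - z1) / run          # rise over run
        weighted_sum += slope * run
        total_weight += run
    return weighted_sum / total_weight if total_weight else 0.0

# Two segments: 100 units at 10% slope, then 100 units at 30% slope
pts = [(0, 0, 0), (100, 0, 10), (200, 0, 40)]
mean_slope = length_weighted_mean_slope(pts)
print(mean_slope)  # length-weighted mean of 0.1 and 0.3 -> 0.2
```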
03-11-2014 02:02 PM | 0 | 0 | 742

POST
Okay I watched a YouTube thing, so I should now theoretically have enough info to submit stuff directly via GitHub myself... I'll get an account soon! BTW: Those functions I showed probably shouldn't be included, since I think you already have some that do very similar things in your arcapi... Just demonstrating that many of us (at least me) have created similar arcpy-based libraries... although as far as I know, not shared/created them as a cooperative process. I've been meaning to update my stuff (which is really a hodgepodge) for a long time... hope to have some time in a few weeks. On a similar note, I've always thought it'd be cool to have a single "collaboratively built" toolbox crammed full of all sorts of neat geoprocessing script tools... kind of an "analysis tools for analysts" sort of thing. Wonder if that could work via GitHub at some point too? Anyway - Good job!!!
03-05-2014 03:15 PM | 0 | 0 | 2448

POST
I guess this shows my ignorance of the internet today, but if we have code/ideas, how would it be best to communicate that with you fellows? I'm sure there's some fancy GitHub way of doing all that... Sadly I'm pretty ignorant of that and stuck in the internet of like 2003 for some reason. Looks like these functions (and frankly, much better/fancier ones!) are already in your arcapi library, but for what it's worth - independent invention and all that - here's a few of mine I use all the time... I'd love to contribute some ideas/code sometime. Send me an email: chris.snyder( at ) dnr.wa.gov

def listFields(inputTable, wildCard = "", fieldType = ""):
    """Lists fields (and selected properties) for inputTable"""
    fieldList = arcpy.ListFields(inputTable, wildCard, fieldType)
    #FORMATTING INFO: NAME = 50 spaces, TYPE = 15 spaces, LENGTH = 15 spaces, SCALE = 15 spaces, PRECISION = 15 spaces
    print "NAME" + " " * 46 + "TYPE" + " " * 11 + "LENGTH" + " " * 9 + "SCALE" + " " * 10 + "PRECISION"
    print "-" * 100
    for field in fieldList:
        print str(field.name)[0:50] + " " * (50 - len(str(field.name))) \
              + str(field.type)[0:15] + " " * (15 - len(str(field.type))) \
              + str(field.length)[0:15] + " " * (15 - len(str(field.length))) \
              + str(field.scale)[0:15] + " " * (15 - len(str(field.scale))) \
              + str(field.precision)[0:15] + " " * (15 - len(str(field.precision)))

def listRecs(inputTable, numberOfRecordsToList = 25, whereClause = ""):
    """Lists field names and corresponding field values in inputTable"""
    fieldNamesList = [f.name for f in arcpy.ListFields(inputTable)]
    rowCount = 1
    recordCount = int(arcpy.GetCount_management(inputTable).getOutput(0))
    if int(numberOfRecordsToList) > recordCount:
        numberOfRecordsToList = recordCount
    searchRows = arcpy.da.SearchCursor(inputTable, ["*"], whereClause)
    searchRow = searchRows.next()
    while rowCount <= int(numberOfRecordsToList):
        print "RECORD #" + str(rowCount)
        print "-" * 50
        for fieldName in fieldNamesList:
            try:
                print fieldName + ": " + str(searchRow[fieldNamesList.index(fieldName)])
            except:
                print fieldName + ": !?!"
        print ""
        print ""
        rowCount = rowCount + 1
        try:
            searchRow = searchRows.next()
        except StopIteration:
            #stop if the cursor runs out (e.g. whereClause matches fewer rows than requested)
            break
    del searchRow, searchRows
    print ""
    print ""
    print "LISTED " + str(numberOfRecordsToList) + " RECORDS"
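The manual space-padding in listFields can be written more compactly with str.format field widths. A minimal pure-Python sketch - the Field class here is a stand-in for arcpy's Field object, just to make the example self-contained:

```python
class Field(object):
    """Stand-in for arcpy's Field object (not part of the original post)."""
    def __init__(self, name, type_, length, scale, precision):
        self.name, self.type = name, type_
        self.length, self.scale, self.precision = length, scale, precision

def format_field_row(field):
    # "<50" / "<15" left-justify and pad to the same column widths the
    # original function builds by hand with " " * (50 - len(...))
    return "{0:<50}{1:<15}{2:<15}{3:<15}{4:<15}".format(
        str(field.name)[:50], str(field.type)[:15], str(field.length)[:15],
        str(field.scale)[:15], str(field.precision)[:15])

row = format_field_row(Field("CREW_CODE", "String", 10, 0, 0))
print(row)
```

Unlike the manual arithmetic, the format spec can never produce a negative pad width for over-long names, since the value is sliced first.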
03-05-2014 01:28 PM | 0 | 0 | 2448

POST
Beautiful! I built something similar (but definitely not as fancy) a while back... like your arcapi, my lmpy basically made python/arcpy more of a command prompt sort of interface (for those that like that sort of thing). Hope to contribute to your project sometime... if I ever have time again to do that sort of thing :rolleyes: Good job Filip and Caleb!
03-05-2014 12:00 PM | 0 | 0 | 2448

POST
You could also set up a loop at the beginning of your script to check if a license is available, maybe every 10 seconds or so... Maybe after a duration of a few hours it would then give up trying... Something like this (very untested) code:

import time, sys, arcpy

licFlag = False
time1 = time.clock()
time2 = time.clock()
#keep trying while there is no license AND less than 2 hours have elapsed
while licFlag == False and ((time2 - time1) / 3600) < 2:
    time2 = time.clock()
    if arcpy.CheckProduct("ArcInfo") == "Available":
        licFlag = True
    else:
        time.sleep(10)
if licFlag == False:
    sys.exit(1) #give up
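The same retry-until-timeout pattern, pulled out into a generic pure-Python helper (the function name and arguments are made up for illustration, and the license check is replaced by any callable, so this runs without arcpy):

```python
import time

def wait_until(check, timeout_seconds, poll_seconds):
    """Hypothetical helper: call check() every poll_seconds until it
    returns True or timeout_seconds have elapsed. Returns True on
    success, False if it gave up."""
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        if check():
            return True
        time.sleep(poll_seconds)
    return False

# Toy stand-in for arcpy.CheckProduct: "available" on the 3rd call
calls = {"n": 0}
def fake_license_check():
    calls["n"] += 1
    return calls["n"] >= 3

got_license = wait_until(fake_license_check, timeout_seconds=5, poll_seconds=0.01)
print(got_license)
```

In the real script, check would be a lambda wrapping arcpy.CheckProduct, timeout_seconds would be a few hours, and poll_seconds around 10.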
02-13-2014 09:15 AM | 0 | 0 | 4145

POST
Comprehensions (list and dictionary flavors) are pretty darn cool, but in reality they are just a shortened/efficient form of a for loop. And yes, if the point is inside the poly, it has a distance of 0; otherwise, the closest distance is returned. What I recall about geom objects in a dictionary (for me at least, a few years back) was that certain reported properties, such as .area or .length, were incorrect compared with the geom object itself (not stored in a dictionary)... basically like the ESRI geom object wasn't 100% (but maybe 80%?) compatible with Python. As I recall, Jason Scheirer from ESRI also chimed in with some info on this at the time, but again I can't seem to find the post! Question for any ESRI people in the know (cough... Jason): Are geometry objects stored in Python dictionaries 100% kosher now? Have they always been, or am I getting dementia already?
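To illustrate the first point - a dict comprehension is just shorthand for the equivalent for loop (a generic example with made-up data, nothing arcpy-specific):

```python
rows = [("A", 10.5), ("B", 22.0), ("C", 7.25)]

# The dict comprehension...
areas_comp = {code: area for code, area in rows}

# ...is just a compact form of this loop:
areas_loop = {}
for code, area in rows:
    areas_loop[code] = area

print(areas_comp == areas_loop)  # same result either way
```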
01-24-2014 08:25 AM | 0 | 0 | 523

POST
Okay, so I guess the geom objects seem to be working as they should in the dictionary. I can't find that post from Kim Oliver about her research into the topic, but in this case it seems to be working fine. Anyone from ESRI remember something about this? Anyway, here's the fastest way I could think to do it (took 1.6 seconds to process 1000 pnts). I didn't have any zip code data handy, so I hacked together an example that uses counties instead. My randomly placed points were randomly assigned a county code. Nina, I added some comments in the code here as you suggested, but without some Python experience they probably won't help much.

#assumes the pnts and polys are in the same PRJ, distances are in PRJ's map units...
import arcpy, time

#a pnt FC, with a field that has the claimed county code
pntsFC = r"C:\csny490\temp\test.gdb\random_pnts"
#polygon FC of counties (with a field for COUNTY_CD)
polyFC = r"C:\csny490\temp\test.gdb\counties"

#add a field to store the distance
arcpy.AddField_management(pntsFC, "DIST_TO_CNTY", "DOUBLE")

#put the county geometry into a dictionary for faster lookups
polyGeomDict = {r[0]: r[1] for r in arcpy.da.SearchCursor(polyFC, ["COUNTY_CD", "SHAPE@"])}

#start a timer
time1 = time.clock()

#initialize an update cursor and iterate through the points
updateRows = arcpy.da.UpdateCursor(pntsFC, ["SHAPE@", "CLAIMED_CD", "DIST_TO_CNTY"])
for updateRow in updateRows:
    #get a hook to the 1st two update cursor field values
    pntObj, countyCode = updateRow[0:2]
    #set the DIST_TO_CNTY value to the distance from the pnt to the specified polygon
    updateRow[2] = pntObj.distanceTo(polyGeomDict[countyCode])
    #commit the update to disk
    updateRows.updateRow(updateRow)

#del the update cursor objects
del updateRow, updateRows

#stop the timer and report the time it took to calc the distances
time2 = time.clock()
print "Took " + str(time2 - time1) + " seconds!"
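The key optimization above - build the lookup dictionary once, then do O(1) lookups inside the loop instead of re-searching the source per point - applies outside arcpy too. A generic sketch with made-up codes and placeholder "geometries":

```python
# Made-up source data: (county code, geometry placeholder) pairs
source = [("C%05d" % i, "geom_%d" % i) for i in range(5000)]
wanted = ["C04999", "C00000", "C02500"]

# Slow way: re-scan the whole source list for every lookup (O(n) each)
def linear_lookup(code):
    for c, geom in source:
        if c == code:
            return geom

# Fast way: build the dictionary once, then each lookup is O(1)
geom_by_code = dict(source)

slow = [linear_lookup(c) for c in wanted]
fast = [geom_by_code[c] for c in wanted]
print(slow == fast)  # identical answers, very different cost per lookup
```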
01-22-2014 10:07 AM | 0 | 0 | 3124

POST
"As I have no experience with pyton yet" Hi Nina - The sooner you learn, the better. If it helps, here's the documentation for:
search cursor: http://resources.arcgis.com/en/help/main/10.1/index.html#//018w00000011000000
update cursor: http://resources.arcgis.com/en/help/main/10.1/index.html#/UpdateCursor/018w00000014000000/
geometry objects: http://resources.arcgis.com/en/help/main/10.1/index.html#/Geometry/018z00000070000000/
... which are the "bread and butter" of what you are trying to do.
01-21-2014 08:04 AM | 0 | 0 | 3124

POST
Hi Neil, I will try it again. In previous efforts (see: http://forums.arcgis.com/threads/9555-Modifying-Permanent-Sort-script-by-Chris-Snyder?p=30332&viewfull=1#post30332) I found that the geom objects somehow got corrupted and basically became useless... but interestingly, only if the map units were non-metric. So for example, UTM meters worked fine... State Plane feet (like I use) was totally messed up for some reason. I think Kim Olivier did some deeper digging on this, as I recall. I'll try again and let you know.
01-21-2014 07:55 AM | 0 | 0 | 3124

POST
"and the zip code border of the zip code the particular person named at the beginning of the survey" Okay, I guess you would have to write a script. There's probably a better/faster way to do this (how many people pnts do you have?). I wonder if the geom objects still get corrupted if they get put into a Python dictionary?? The performance trick would be to access the geometry of the zip code polys as quickly as possible... Anyway, some code (UNTESTED!) might look like this:

#assumes the pnts and zipcode polys are in the same PRJ, distances are in PRJ's map units...
import arcpy

peoplePntsFC = r"C:\temp\people.shp" #a pnt FC, with a field that has the claimed zip code
zipCodeFC = r"C:\temp\zipcodes.shp"  #polygon FC of zipcodes

arcpy.AddField_management(peoplePntsFC, "DIST_TO_ZIP", "DOUBLE")
updateRows = arcpy.da.UpdateCursor(peoplePntsFC, ["SHAPE@", "ZIP_CODE", "DIST_TO_ZIP"])
for updateRow in updateRows:
    pntObj = updateRow[0]
    zipCode = updateRow[1]
    #grab the matching zip code polygon (a new search cursor per point - simple, but slow)
    zipPolyObj = arcpy.da.SearchCursor(zipCodeFC, ["SHAPE@"], "ZIP_CODE = " + str(zipCode)).next()[0]
    updateRow[2] = pntObj.distanceTo(zipPolyObj)
    updateRows.updateRow(updateRow)
del updateRow, updateRows
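For the geometry behind distanceTo: the distance from a point to a polygon is 0 if the point is inside, otherwise the minimum distance to any edge. A plain-Python sketch for a simple, non-self-intersecting polygon - the helper names are hypothetical and nothing here uses arcpy:

```python
import math

def point_segment_distance(px, py, ax, ay, bx, by):
    """Distance from point (px, py) to the segment a-b."""
    abx, aby = bx - ax, by - ay
    seg_len_sq = abx * abx + aby * aby
    if seg_len_sq == 0:
        return math.hypot(px - ax, py - ay)
    # Project the point onto the segment, clamping to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / seg_len_sq))
    return math.hypot(px - (ax + t * abx), py - (ay + t * aby))

def point_in_polygon(px, py, poly):
    """Ray-casting test: True if (px, py) falls inside poly."""
    inside = False
    for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]):
        if (y1 > py) != (y2 > py):
            x_cross = x1 + (py - y1) * (x2 - x1) / (y2 - y1)
            if px < x_cross:
                inside = not inside
    return inside

def distance_to_polygon(px, py, poly):
    """0 if the point is inside, else the min distance to any edge."""
    if point_in_polygon(px, py, poly):
        return 0.0
    return min(point_segment_distance(px, py, x1, y1, x2, y2)
               for (x1, y1), (x2, y2) in zip(poly, poly[1:] + poly[:1]))

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(distance_to_polygon(5, 5, square))   # inside -> 0.0
print(distance_to_polygon(13, 5, square))  # 3 units right of the edge -> 3.0
```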
01-16-2014 12:56 PM | 0 | 0 | 3124

POST
You could write a script to do this, but how about just using the Near tool: http://resources.arcgis.com/en/help/main/10.2/index.html#//00080000001q000000
01-16-2014 12:32 PM | 0 | 0 | 3124

POST
I also sometimes add this optimization step (a select by location - sometimes a complex series of them) to many of my scripts that use overlay tools. It doesn't seem to help much for smaller datasets, but it does a lot for some larger ones. I theorize that this method can often whittle the input features down enough to keep the 'LargeOverlayTiles' background process from firing up. This overlay-tiling behavior (while well intentioned for sure, and pretty much hidden from view) can add unexpectedly and unreasonably long processing times to overlay tasks.
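The pre-filter idea is essentially: cheaply discard features whose bounding boxes can't possibly intersect the area of interest before running the expensive overlay. A generic pure-Python sketch with made-up extents and a hypothetical helper name:

```python
def boxes_overlap(a, b):
    """True if two (xmin, ymin, xmax, ymax) boxes intersect."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

# Made-up feature extents and an area of interest
features = {
    "f1": (0, 0, 10, 10),
    "f2": (50, 50, 60, 60),
    "f3": (8, 8, 20, 20),
}
aoi = (5, 5, 15, 15)

# Cheap spatial pre-filter: only features whose boxes touch the AOI
# survive to the expensive overlay step
candidates = sorted(fid for fid, box in features.items()
                    if boxes_overlap(box, aoi))
print(candidates)  # f2 is nowhere near the AOI, so it drops out
```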
01-16-2014 08:28 AM | 0 | 0 | 697

POST
Not sure I understand your purpose here, but the most efficient way to store a pixel value from a raster is... the original raster. What information/analysis are you wanting to do? Knowing that will dictate the best way to proceed. That said, I noticed that v10.2+ now supports multiband rasters in RasterToNumPyArray: http://resources.arcgis.com/en/help/main/10.2/index.html#//018v00000023000000 Otherwise you would have to cobble several 2D arrays together yourself, which I believe NumPy has methods for. NumPy is probably your best bet here I think, but I have little/no expertise in its use.
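Cobbling 2D band arrays into one multiband array is a one-liner in NumPy (a generic example with made-up data, not tied to any particular raster):

```python
import numpy as np

# Three made-up single-band rasters, each 2 rows x 3 cols
band1 = np.array([[1, 2, 3], [4, 5, 6]])
band2 = band1 * 10
band3 = band1 * 100

# Stack along a new leading axis -> shape (bands, rows, cols),
# a common band-first layout for multiband rasters
multiband = np.stack([band1, band2, band3])
print(multiband.shape)  # (3, 2, 3)
```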
01-13-2014 10:25 AM | 0 | 0 | 4397
| Title | Kudos | Posted |
|---|---|---|
| | 1 | 03-25-2014 12:57 PM |
| | 1 | 08-29-2024 08:23 AM |
| | 1 | 08-29-2024 08:21 AM |
| | 1 | 02-13-2012 09:06 AM |
| | 2 | 10-05-2010 07:50 PM |
| Online Status | Offline |
| Date Last Visited | 08-30-2024 12:25 AM |