|
POST
|
I didn't understand why my result object was coming up empty. Doh! Index values help... CODE:
childfc = r"C:\Temp\MyGDB\Stores"
parentfc = r"\\sharedrive\ParentGDB\Stores"
result = arcpy.FeatureCompare_management(parentfc, childfc, "DID", "ATTRIBUTES_ONLY", "", "", "", "", "", "", "NO_CONTINUE_COMPARE")
print "Comparison Result is: " + result.getOutput(1)
result.getOutput(1)
Shell Results:
Comparison Result is: false
u'false'
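For anyone landing here later: getOutput(1) hands the compare status back as a unicode string, not a Python boolean, so it pays to normalize it before branching on it. A minimal sketch (the helper name is my own, not part of arcpy):

```python
def compare_status_to_bool(status):
    # arcpy's comparison tools report their status as the strings
    # 'true'/'false' (unicode), not as Python booleans, so a direct
    # truthiness test on the string would always be True.
    return str(status).strip().lower() == "true"

# In a real arcpy session: status = result.getOutput(1)
print(compare_status_to_bool(u"false"))
```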
05-22-2013
05:40 AM
|
0
|
6
|
3105
|
|
POST
|
So digging around a little more, I found a nifty little FeatureCompare function in the management toolbox. It seems to give me a decent solution for flagging child FCs that differ from the parent FC. According to the ArcGIS Resources help file: "The comparison tools result object will be 'true' when no differences are found and 'false' when differences are detected." So I should be able to return a true or false from the result object. It seems, however, that my result object is not returning anything. Does it matter that the result object is a unicode string? I don't understand why my result object is coming up empty. CODE:
childfc = r"C:\Temp\MyGDB\Stores"
parentfc = r"\\sharedrive\ParentGDB\Stores"
result = arcpy.FeatureCompare_management(parentfc, childfc, "DID", "ATTRIBUTES_ONLY", "", "", "", "", "", "", "NO_CONTINUE_COMPARE")
print "Comparison Result is: " + result
result.GetOutput
Shell Results:
Comparison Result is:
<bound method Result.getOutput of <Result ''>>
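Two things are going on in that shell output, which a tiny stand-in class (not arcpy, purely an illustration) makes visible: concatenating the Result object itself only gives its (empty) string form, and getOutput without parentheses is just a bound method that never gets called:

```python
class FakeResult(object):
    # Minimal stand-in for arcpy's Result object, for illustration only.
    def __init__(self, status):
        self._status = status

    def getOutput(self, index):
        # Real Result objects hold multiple outputs by index; for
        # FeatureCompare, index 1 carries the compare status string.
        return self._status

result = FakeResult(u"false")
print(result.getOutput)     # a bound method, like the shell showed
print(result.getOutput(1))  # the actual status string
```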
05-22-2013
05:23 AM
|
0
|
0
|
3105
|
|
POST
|
I've never used ArcObjects via Python, but you can: http://gis.stackexchange.com/questions/80/how-do-i-access-arcobjects-from-python Sweet. I'll give it a go. Thanks!
05-21-2013
02:27 PM
|
0
|
0
|
6203
|
|
POST
|
Since it's not currently possible through anything I've seen in Python, I've added it to the ideas site: https://c.na9.visual.force.com/apex/ideaView?id=087E00000004eOq
05-21-2013
01:45 PM
|
0
|
0
|
6203
|
|
POST
|
By the looks of this http://gis.stackexchange.com/questions/23914/how-to-get-the-size-of-a-file-geodatabase-feature-class-on-disk you could probably get at this info with ArcObjects some way. Edit: yup... http://gis.stackexchange.com/questions/24242/how-to-programmatically-determine-the-size-of-a-feature-class-in-a-file-geodatab ArcObjects will get it done. Thanks James, I actually saw this before posting here, but I'm not sure how to leverage ArcObjects from Python, if I can at all.
05-21-2013
01:28 PM
|
0
|
0
|
6203
|
|
POST
|
Thanks James, I initially thought the same thing but was skeptical since Windows doesn't natively understand the GDB format. Needless to say, this raises an error.
05-21-2013
01:26 PM
|
0
|
0
|
6203
|
|
IDEA
|
It would be really nice to be able to return the 'File Size' and 'Date Modified' properties of a feature class through the arcpy.Describe function. This would be really helpful in ensuring that users are current with their datasets. Something like:
import arcpy
# Create a Describe object from the feature class
desc = arcpy.Describe("C:/data/arch.dgn/Point")
# Print some feature class properties
print "File Size: " + desc.fileSize
print "Date Modified: " + desc.dateModified
print "Read/Write: " + desc.permissions
05-21-2013
07:27 AM
|
54
|
4
|
7138
|
|
POST
|
There doesn't appear to be a function in arcpy to return the file size of a feature class or its date-modified timestamp. Anyone have any idea how to do this? I know there is a property somewhere because you can set up ArcCatalog to display the information. However, I want it returned in a Python script so that I can write a two-way synchronization function.
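One workaround, since a file geodatabase is just a folder of files on disk: walk the folder with the standard library. Note the caveat that this measures the whole .gdb, not an individual feature class inside it (internal files aren't mapped one-to-one to feature classes), so it's a sketch, not a substitute for a real Describe property:

```python
import os

def folder_size_and_mtime(path):
    """Total on-disk size in bytes, and the most recent modification
    time, of all files under 'path' (e.g. a .gdb folder)."""
    total, latest = 0, 0.0
    for root, _dirs, files in os.walk(path):
        for name in files:
            full = os.path.join(root, name)
            total += os.path.getsize(full)
            latest = max(latest, os.path.getmtime(full))
    return total, latest
```

Comparing the returned mtime between the parent and child copies would at least flag which side changed most recently.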
05-21-2013
06:19 AM
|
2
|
30
|
25555
|
|
POST
|
If you added a field index you'd get the time back in spades when you ran the Select tool -- even faster! That's a good idea, may play with that tomorrow!
05-01-2013
12:15 PM
|
0
|
0
|
3094
|
|
POST
|
Neat. I may steal that. Hey Curt, I thought about doing a MakeFeatureLayer and then running a series of SelectLayerByAttribute calls, but from my experience, that significantly increases processing time. Not a big deal with small to moderately sized datasets, but with 3.2 million records, I didn't want to play with those implications. Unfortunately, I can't take credit for the UniqueValues definition either; I lifted it off of an old Cafe Python post. Someone commented on the post about leveraging the TableToNumPyArray function to get better performance, which made me really excited, but alas my feeble mind could not get it to work like I wanted.
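For the curious, the TableToNumPyArray route mentioned above essentially boils down to numpy.unique on the field column. Sketched here with a plain structured array standing in for the arcpy call (field names and values are made up for the example):

```python
import numpy

def unique_field_values(records, field):
    # 'records' stands in for arcpy.da.TableToNumPyArray(fc, [field]);
    # numpy.unique returns the sorted distinct values in one C-speed pass.
    return numpy.unique(records[field])

records = numpy.array([(101,), (103,), (101,), (102,)],
                      dtype=[("StoreID", "<i4")])
print(unique_field_values(records, "StoreID"))  # [101 102 103]
```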
05-01-2013
12:00 PM
|
0
|
0
|
3094
|
|
POST
|
Chris, I played with this notion earlier but couldn't get it working... I modified your code a bit to export to a feature dataset and also provide print statements as it ran, but all in all, right on the money. Thanks a ton! It's probably going to take several hours to crunch through all the data but it seems to be chugging along. The only issue I see is that since I'm running it in the Python window, it keeps adding each dataset to my TOC. I didn't really expect that. It's not a huge deal because drawing is paused, but nonetheless, something I didn't expect to happen. So long as it crunches through all the data, I'm going to take a stab at turning it into an add-in later. For anyone that wants the full code if you're having issues with large datasets, here's what I'm using:
import arcpy
from arcpy import env
# Create Workspace. The 'Select_analysis' tool exports to the current workspace, so this is the
# directory where your exported datasets will end up. I'm dropping mine in a feature dataset in
# the 'Default.gdb' on my system.
workspace = arcpy.env.workspace = r"C:\Users\Username\My Documents\ArcGIS\Default.gdb\CustomersByStore"
# Create a variable to hold the FC you want to split by attribute.
fc = r"C:\Users\Username\My Documents\ArcGIS\Default.gdb\Customers\MyCustomers"
# Instantiate a 'set' of unique values from the 'StoreID' field in the 'fc' dataset using the
# native python 'set' class. Depending on how large your dataset is, this could take a bit of
# time. Mine was 3.2 million records and only took 3-4 minutes to run.
StoreSet = set([r[0] for r in arcpy.da.SearchCursor(fc, ["StoreID"])])
# Create the 'for' loop that will iterate through your StoreSet by unique ID.
for ID in StoreSet:
    # Create a variable to name each exported dataset according to the unique value. If your
    # field is an integer, ensure you convert it to string. Just converting the integer to
    # string wasn't enough though; I also had to tack a string ('Customers_') on the front,
    # since feature class names can't start with a digit. By default, Select_analysis exports
    # to the current workspace, so if you didn't set one as I did above, hardcode the path here.
    OutFC = "Customers_" + str(ID)
    # Print which recordset the script is currently processing.
    print "Exporting " + str(ID) + "..."
    # Export the dataset using the 'Select_analysis' function. Because the current workspace is
    # set, each output lands in that feature dataset.
    arcpy.Select_analysis(fc, OutFC, "StoreID = " + str(ID))
print "Script Complete."
05-01-2013
11:50 AM
|
0
|
0
|
3094
|
|
POST
|
I have a feature class with 3.2 million records which represents customers. Each customer is assigned to a particular store ID (DID), so I'm trying to split the customer records into store ID datasets. That is, for each store ID (DID), create a dataset of the customers assigned to it, essentially splitting the FC by each unique attribute value in the field. I was able to pull up a list of the unique store IDs by using:
def Unique_Values(table, field):
    with arcpy.da.SearchCursor(table, [field]) as cursor:
        return sorted({row[0] for row in cursor})

fc = r"C:\Users\username\Documents\ArcGIS\Default.gdb\MyCustomers"
field = "DID"
Unique_Values(fc, field)
which resulted in a list of all of my unique store IDs, 957 in all. What I'm not sure of, however, is how to go about iterating through that list and using each value in a where_clause in a "FeatureClassToFeatureClass_conversion" to export each unique dataset. I know there are third-party tools to do this, but they don't seem to work due to the number of records. I've tried every single one I could find.
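The iterate-and-export pattern being asked about can be sketched without arcpy at all; 'export' below is a stand-in for FeatureClassToFeatureClass_conversion (or Select_analysis), called once per unique ID with a where clause and an output name (both naming conventions here are my own):

```python
def export_per_value(unique_ids, export):
    # For each unique store ID, build a where clause and an output name,
    # then hand both to the export function (an arcpy tool in real use).
    for did in sorted(unique_ids):
        export("DID = {0}".format(did), "Customers_{0}".format(did))

# Demo with a recording stub instead of a real geoprocessing tool:
calls = []
export_per_value({3, 1, 2}, lambda where, out: calls.append((where, out)))
print(calls[0])  # ('DID = 1', 'Customers_1')
```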
05-01-2013
10:19 AM
|
0
|
15
|
4850
|
|
IDEA
|
It would be ideal to be able to access the Maplex Labeling Engine through Python as well.
04-24-2013
10:43 AM
|
21
|
3
|
2321
|
|
POST
|
I apologize if this seems pedantic, but a recursive function is one that calls itself, not one that loops per se. You can often (always?) accomplish the same thing either way, but one or the other may be more efficient depending on what you're doing. Loops are definitely easier to understand though. Hey Gregg, no worries, I wasn't sure if I was using the right term or not. Technically though, I think this would be recursive because the function, let's say it's "CalculateField_management" for this purpose, does need to call itself again after every complete iteration. So once it calculates the field for record 2, it needs to go back and calculate the field for record 1 again. Once it calculates the field for record 3, it needs to go back and calculate the field for records 1 and 2 again... then on to record 4, going back over records 1, 2 and 3 again... over and over until it's calculated the field for the very last feature and then gone back and recalculated the field for every feature before it. It makes my brain hurt just thinking about it. I'm sorry I can't reveal more about the function, but it's proprietary. There are 957 features in the feature class, so I'm trying to avoid having to code 956 loops, if that makes sense. So, I think this is recursive based on my understanding of recursion... perhaps not; it wouldn't be the first time I was wrong.
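For what it's worth, the distinction Gregg draws can be shown in a few lines: recursion is a function calling itself on a smaller input, while the "revisit every earlier record" pattern described above is plain nested iteration, so no 956 hand-coded loops are needed. Both examples below are generic illustrations, not the proprietary function:

```python
def sum_recursive(n):
    # Recursion: the function calls itself on a smaller input.
    return 0 if n == 0 else n + sum_recursive(n - 1)

def revisit_all_earlier(records, calc):
    # The pattern from the post: after reaching record i, re-run the
    # calculation on records 0..i. Two nested loops, no recursion.
    for i in range(len(records)):
        for j in range(i + 1):
            calc(records[j])

print(sum_recursive(4))  # 10
```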
04-19-2013
08:42 AM
|
0
|
0
|
1977
|
|
POST
|
Thanks for the reply, Chris. This is a great start. I'll give it a go today and see how far I get... Much appreciated!
04-19-2013
06:11 AM
|
0
|
0
|
1977
|