POST
Did you ever get an answer to this question? I'm curious to know what's happening as well.
Posted 06-06-2013 09:25 AM

POST
I agree. There needs to be some way to automate this. At the very least there should be some sort of XML template you can feed to the Create Address Locator tool. I'm creating a data-processing model for a customer who will run it whenever they need to update their data in the custom ArcEngine software they bought from us. We've automated the entire process to make it as painless as possible, except that after the model completes they still have to set up the Address Locator manually. This is very frustrating, as we promised to deliver a set of tools that would process their data so it works with our software. Will this be addressed in 10.2?
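For what it's worth, the closest I've gotten to scripting it is calling the Create Address Locator geoprocessing tool from arcpy. This is an untested sketch: the "US Address - Dual Ranges" style, the 'Primary Table' role, and the field names are placeholders you'd swap for whatever your data actually uses.

```python
def build_field_map(field_pairs):
    # Build the "'LocatorField' ReferenceField;..." string the
    # Create Address Locator tool expects for its field map.
    return ";".join("'%s' %s" % (locator_field, ref_field)
                    for locator_field, ref_field in field_pairs)

def create_locator(reference_fc, out_locator, field_pairs):
    # Deferred import so the string helper above runs without ArcGIS.
    import arcpy
    field_map = build_field_map(field_pairs)
    # Style name and reference-data role are placeholders --
    # substitute the locator style your data actually uses.
    arcpy.CreateAddressLocator_geocoding(
        "US Address - Dual Ranges",
        "%s 'Primary Table'" % reference_fc,
        field_map,
        out_locator)
```

If that works, the field pairs could just as easily be read from an XML/config template, which would remove the last manual step from the model.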
Posted 05-01-2013 08:43 AM

POST
I am writing a custom replication script that compares each field of every row in a feature class. I am able to compare the SHAPE field in point, polyline, or polygon features using row.SHAPE.equals(other_geometry), but this won't work for annotation feature classes. The returned arcpy object for the SHAPE field in annotation feature classes is a 'passthrough' object rather than a Polygon object. It doesn't seem to have any useful attributes for comparing. Is there any way for me to determine if the SHAPE fields are different between two annotation feature class rows?
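The closest workaround I can think of (untested, and it assumes a version where arcpy.da cursors and the SHAPE@WKB token are available, i.e. 10.1+) is to skip the geometry object entirely and compare the raw WKB bytes. Whether annotation exposes its polygon through SHAPE@WKB is itself an assumption.

```python
def shapes_differ(wkb_a, wkb_b):
    # Compare two geometries by their raw WKB byte strings. This never
    # touches the arcpy geometry object, so the 'passthrough' problem
    # with annotation shapes doesn't come up.
    if wkb_a is None or wkb_b is None:
        return wkb_a is not wkb_b
    return bytes(wkb_a) != bytes(wkb_b)

def annotation_rows_differ(fc_a, fc_b, where_clause):
    # Requires arcpy.da (ArcGIS 10.1+); deferred import so the byte
    # comparison above runs anywhere.
    import arcpy
    with arcpy.da.SearchCursor(fc_a, ["SHAPE@WKB"], where_clause) as cur_a, \
         arcpy.da.SearchCursor(fc_b, ["SHAPE@WKB"], where_clause) as cur_b:
        return shapes_differ(next(cur_a)[0], next(cur_b)[0])
```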
Posted 09-12-2011 11:55 AM

POST
I'm working on an arcpy Python script that updates, inserts, and deletes rows in a dataset. I can do everything I need with arcpy except preserve GlobalID values when copying a row from one feature class to another. It seems I will need C# or VB.NET to preserve GlobalIDs on insert; the relevant code is here: http://help.arcgis.com/en/sdk/10.0/arcobjects_net/conceptualhelp/index.html#//00010000019v000000 I need my Python script to be able to call the C# code in that link. Basically, my Python script will perform all of the updates and deletes using an UpdateCursor, then perform the inserts using the C# code. Are there any examples of arcpy scripts or tools taking this approach? I know that writing the whole thing in C# would probably be the 'best' route, but we are trying to build an infrastructure of custom Python scripts and tools for our organization.
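My current plan, sketched below and untested, is to compile the ArcObjects code into a small console exe and shell out to it from Python with subprocess. PreserveGlobalIdInsert.exe and its argument order are hypothetical placeholders for whatever the C# side actually parses.

```python
import subprocess

def build_insert_command(exe_path, source_fc, target_fc, global_ids):
    # Argument order is a placeholder -- match it to whatever the
    # console app expects. GlobalIDs are joined into one argument.
    return [exe_path, source_fc, target_fc, ";".join(global_ids)]

def run_preserving_inserts(exe_path, source_fc, target_fc, global_ids):
    cmd = build_insert_command(exe_path, source_fc, target_fc, global_ids)
    # check_call raises CalledProcessError if the exe exits nonzero,
    # so the Python script stops instead of silently continuing.
    subprocess.check_call(cmd)
```

That way the Python script stays in charge of the workflow and only the GlobalID-preserving inserts cross the language boundary.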
Posted 08-25-2011 11:02 AM

POST
Thanks for the code, Bruce. I looked through it and tried the tool, but it only compares the features and returns the differences. If I added the Added Features output to the "original" feature class with an InsertCursor, each inserted row would still be assigned a new GlobalID. My code handles the comparisons fine; I just need a way to keep the GlobalID the same when doing inserts.
Posted 08-17-2011 12:13 PM

POST
I am creating a Python script that compares two feature classes - a parent and a child - and updates the child with whatever changes were made to the parent. For example, all of our "prime time" data is stored in our Publication SDE gdb. We have a separate gdb that "mirrors" the Publication gdb, except it is projected to Web Mercator. Whenever a feature is added, removed, or changed in Publication, the script detects it and makes the corresponding change in the Web Mercator gdb. All of the adds, deletes, and updates are determined by comparing GlobalIDs. This works fine for deletes and updates, but I am having issues with adds. My code correctly determines which row to insert from the input feature class into the output feature class, but when the row is inserted it is given a new GlobalID that doesn't match the original feature's GlobalID. I understand that the InsertCursor is technically inserting a new row, hence the new GlobalID, but what I need is a way to copy an individual row. My question is: is there any way to copy an individual feature (row) from one location to another while maintaining the parent's GlobalID? I have only found ways of copying entire feature classes or tables. I feel like there should be a sort of 'CopyCursor'.

def insertRow(outInsertCur, outDatasetPath, inDatasetPath, globalId):
    in_search = arcpy.SearchCursor(inDatasetPath, "GlobalID = '%s'" % globalId)
    insert_row = in_search.next()
    outInsertCur.insertRow(insert_row)
    print "Inserted %s into %s" % (insert_row.GlobalID, outDatasetPath)
    del in_search

out_insert = arcpy.InsertCursor(out_dataset_path)
insertRow(out_insert, out_dataset_path, in_dataset_path, add_id)

What I am trying to create is similar to replication, but my organization does not want to go that route.
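For clarity, the add/delete/update detection boils down to set logic on GlobalID-keyed dictionaries. This is a simplified illustration rather than my production code:

```python
def diff_by_global_id(parent_rows, child_rows):
    # parent_rows / child_rows: dicts mapping GlobalID -> a comparable
    # tuple of the row's attribute values.
    parent_ids = set(parent_rows)
    child_ids = set(child_rows)
    adds = parent_ids - child_ids       # in parent, missing from child
    deletes = child_ids - parent_ids    # in child, gone from parent
    updates = set(gid for gid in parent_ids & child_ids
                  if parent_rows[gid] != child_rows[gid])
    return adds, deletes, updates
```

Deletes and updates then just need a WHERE clause on the GlobalID; it's only the adds where the InsertCursor invents a new GlobalID and breaks the link.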
Posted 08-16-2011 03:39 PM

POST
I just tested it manually and with a script, and it appears that any multi-page PDF slows down in proportion to the number of pages. I tested exporting with arcpy.mapping.ExportToPDF and appending each page to a single PDF document, as well as choosing Export -> All Pages to export all Data Driven Pages at once. Both methods produce a slower PDF than if I combine the pages manually in Acrobat, and the more pages, the slower it gets. Unless there is something I'm missing, I believe there is an inefficiency in the way ArcMap creates multi-page PDFs.
Posted 03-08-2011 02:26 PM

POST
I'm using a script that exports over 700 pages and combines them into a multi-page PDF. The PDF runs incredibly slowly every time I export it, regardless of the file size, and even with fewer pages. I have tried adjusting the resolution and image quality, but it still runs slowly. By "running slow" I mean that it takes just under a minute for the file to open fully, and it freezes for 20 or 30 seconds each time I scroll from page to page. The PDF performs this badly on the other computers where I work as well, and they are all powerful machines. I tried saving it as an "Optimized" PDF in Acrobat; although that reduced the file size by a third, it didn't improve the speed. The number of pages shouldn't be the issue, because we have built the same multi-page PDF manually in the past without a script, and it runs fine. Has anyone found a solution to this?
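One thing I plan to try (untested, and assuming my slowdown comes from per-page embedded content) is turning off the embedded layers/attributes and georeference info on each page before appending; layers_attributes and georef_info are the ExportToPDF parameter names as I understand them from the arcpy.mapping documentation.

```python
def page_pdf_paths(out_dir, base_name, page_count):
    # One temp PDF per page: out_dir/base_0001.pdf, base_0002.pdf, ...
    return ["%s/%s_%04d.pdf" % (out_dir, base_name, i)
            for i in range(1, page_count + 1)]

def export_lean_pdf(mxd_path, out_dir, base_name, final_pdf):
    # arcpy.mapping API (ArcGIS 10.x); deferred import so the path
    # helper above runs anywhere.
    import arcpy
    mxd = arcpy.mapping.MapDocument(mxd_path)
    ddp = mxd.dataDrivenPages
    final = arcpy.mapping.PDFDocumentCreate(final_pdf)
    for page_id, page_path in enumerate(
            page_pdf_paths(out_dir, base_name, ddp.pageCount), start=1):
        ddp.currentPageID = page_id
        # Skip embedded layer trees and georeference info, which bloat
        # every page of a multi-page export.
        arcpy.mapping.ExportToPDF(mxd, page_path,
                                  layers_attributes="NONE",
                                  georef_info=False)
        final.appendPages(page_path)
    final.saveAndClose()
```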
Posted 03-07-2011 12:29 PM
Online Status: Offline
Date Last Visited: 11-11-2020 02:23 AM