BLOG
The process you are describing is not something I cover in this blog. It requires an insertCursor for new records and multiple dictionaries built in multiple passes to process each cursor correctly (including a deleted-record dictionary if you want a true match). The updateRows.updateRow(updateRow) line also has to occur only at the indentation level where the if condition detecting changes is met, not at a level that executes for every row. I will see if I ever posted anything elsewhere that comes closer to that approach. If you post the code you have working, I could copy it and modify it to get closer to what you are describing; it is too difficult to create a post with sample code on my phone without that as a starting point. For the update part, this should work to ignore Nulls and only update records that do not match:

# verify that the keyValue is in the Dictionary
if keyValue in valueDict:
    if list(updateRow[2:]) != list(valueDict[keyValue]):
        # transfer the values stored under the keyValue from the dictionary to the updated fields.
        for n in range(2, len(sourceFieldsList)):
            if valueDict[keyValue][n-2] != None:
                updateRow[n] = valueDict[keyValue][n-2]
        updateRows.updateRow(updateRow)

I am not sure if the comparison line works correctly in converting the tuples to lists, so let me know if it triggers an error so that some print statements can be added to get the comparison working correctly.
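Regarding the comparison-line concern: converting a tuple to a list before comparing it to a list does work in plain Python, since == and != on two lists compare element by element. A minimal, arcpy-free sketch (the row and dictionary values here are made-up sample data, not from any real dataset):

```python
# Simulate an update cursor row (a list) and a dictionary value (a tuple),
# the types arcpy.da cursors produce; fields 0-1 are key fields, 2+ are data fields.
updateRow = [1, "KEY1", "Main St", 101, 199]
valueDict = {"KEY1": ("Main St", 101, 199)}

# A tuple never equals a list directly, even with identical elements...
assert ("Main St", 101, 199) != ["Main St", 101, 199]

# ...but converting both sides to lists makes the comparison element-wise.
assert list(updateRow[2:]) == list(valueDict["KEY1"])

# A genuine difference is still detected after conversion.
valueDict["KEY2"] = ("Main St", 101, 200)
assert list(updateRow[2:]) != list(valueDict["KEY2"])

print("comparisons behave as intended")
```

So the tuple-to-list conversion itself should not trigger an error; any mismatch would come from the row slice and the stored tuple having different lengths.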
02-27-2025 09:32 PM

BLOG
There was another error in the code, caused by editing on my phone, that I corrected on the last line, which should have been two separate lines of code as now shown. Also, you should be prepared to adjust and correct any indentation errors on your own in your IDE, since they can sometimes be difficult to catch in the community's code window. The IDE was highlighting the character where the line-return and indentation error occurred in your screenshot.
02-27-2025 09:13 PM

BLOG
|
I am editing using my phone, so I just had to correct some code errors that occurred due to the difficulty of navigating in that editing environment. Please recheck my post's code.
02-27-2025 08:57 PM

BLOG
|
You should have based your code on the section entitled "Example 2 - Transfer of Multiple Field Values between Feature Classes where there is a 1:1 Match between Field Sets". The section for updating fields should look like the code below to update all matched fields in the records where the value is Null. However, you should test this on a copy of your data before applying it to any production data, to verify that I aligned the field updates correctly with your other code modifications:

# verify that the keyValue is in the Dictionary
if keyValue in valueDict:
    # transfer the values stored under the keyValue from the dictionary to the updated fields.
    for n in range(2, len(sourceFieldsList)):
        if valueDict[keyValue][n-2] == None:
            updateRow[n] = 0
        else:
            updateRow[n] = valueDict[keyValue][n-2]
    updateRows.updateRow(updateRow)

This code assumes you want to assign a default value of 0 if the source field is Null and otherwise transfer the value of the source field to the target field. I have made several revisions to the code since my initial response after considering your requirements more carefully. Hopefully, I have understood them correctly.
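The transfer loop above can be exercised without arcpy. This sketch simulates the cursor row as a plain list, using made-up field names and sample values, to show the Null-to-0 substitution in isolation:

```python
# Simulated source values keyed by a join value, as the search-cursor
# dictionary would hold them; None stands in for a Null field value.
# Field names and values here are hypothetical sample data.
valueDict = {"P-100": (25.5, None, "Residential")}
sourceFieldsList = ["JOINKEY", "OID@", "AREA", "FRONTAGE", "ZONING"]

# Simulated update cursor row: [key, oid, AREA, FRONTAGE, ZONING]
updateRow = ["P-100", 1, None, None, None]
keyValue = updateRow[0]

if keyValue in valueDict:
    # transfer the values stored under the keyValue to the updated fields,
    # substituting 0 for any Null (None) source value.
    for n in range(2, len(sourceFieldsList)):
        if valueDict[keyValue][n - 2] is None:
            updateRow[n] = 0
        else:
            updateRow[n] = valueDict[keyValue][n - 2]

print(updateRow)  # → ['P-100', 1, 25.5, 0, 'Residential']
```

The Null FRONTAGE value becomes 0 while the non-Null values transfer unchanged, which is the behavior the code above is intended to produce.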
02-27-2025 08:27 PM

IDEA
|
I am not having issues with the Preserve GlobalIDs setting. I only mentioned that setting in my post because it is simpler to apply than the procedure you have outlined for preserving ObjectID values. I am more inclined to consider the preservation of GlobalIDs setting as a truly built-in Already Offered feature over the steps required for the preservation of ObjectIDs. However, I accept that the software currently supports both aspects of my original idea provided the user is familiar with both of these settings/steps.
02-22-2025 11:09 AM

IDEA
|
@SSWoodward I was unaware that this process was an option. I believe I had tried to use the field map in a similar way and had failed to get the expected result, but that was using the Summary/Spatial Join tools with field map capabilities and not the Export Feature tool. I will try this. It isn't as simple as the Environment option for GlobalIDs, but it would meet my needs for source data that cannot be modified to contain a long field for the ObjectID directly once I have verified it works as you have described.
02-18-2025 07:10 PM

IDEA
|
I am using ArcGIS Pro 3.2. The Joins and Relates set-up dialog is painfully slow to fill in its parameters. For example, with an active map containing 35 layers/standalone tables, it takes up to 2 minutes to populate the parameter drop-downs listing the available layers/tables and fields. Once the drop-down options are available, the tool validation process virtually always resets all of my parameter choices back to the original configuration the tool opened with, approximately 4 times before it accepts my choices. It has at times forced me to choose the values I want up to 25 times over up to 4 more minutes. It seems to be stuck validating the poorly constructed initial tool configuration, which it cannot satisfactorily complete before it takes any real input from me. The validation is excessive and unresponsive, and its efforts to autocomplete the parameters for me are all counterproductive.

In the case of joins, I have had to wait up to 20 minutes for the joined table to refresh after completing a join between an SDE feature class and a file geodatabase table with only 35K records in each and a one-to-one relationship. All other open table views become disabled for table navigation and selection while the join I created is refreshing. Once the join of the 35K-record tables was complete, it took about 20 minutes to perform a simple selection query based on the join, even though the fields had attribute indexes. This whole process and poor performance occurred repeatedly in multiple fresh, exclusive sessions of Pro. A field calculation also took 20 minutes. I repeated this with other SDE-to-file-geodatabase joins with different record set sizes, and all performed similarly poorly at every step of the joins process. ArcMap performed the entire set of tasks in under 4 minutes. Activating a relate and getting the results also seems to take considerably longer than it does in ArcMap, regardless of the size of the selection that activated the relate.
02-13-2025 07:59 AM

IDEA
|
I had proposed a similar idea for the same reasons under Feature-class-to-feature-class-options-to-preserve-ObjectID-and-GlobalID. I agree there are numerous situations that make adding and maintaining a field in the source FC to preserve a static unique ObjectID value impossible. And even when it is possible, a field like that is difficult to set up in the source when it needs to be continuously maintained or must avoid unintended consequences for other behaviors like Editor Tracking.
11-01-2024 07:21 AM

POST
|
I appreciate all of the suggestions. I have not tried the duplicate parcels tool yet, but if it does not give an error it sounds like that would be a possible workaround. The original parcel fabric had Editor Tracking disabled on all feature classes. The new parcel type I created had Editor Tracking enabled. I have enabled Editor Tracking on the records, parcel fabric adjustment feature classes, connection lines and points and the original parcel type polygons and polylines. I tried the Change Parcel Type tool again and it worked.
10-29-2024 09:28 AM

POST
|
ArcGIS Pro 3.2 Parcel Fabric version 5. I am trying to use the Change Parcel Type tool in the Parcel Record Workflows tab. The parcel fabric is from another organization and at some point was converted from an ArcMap parcel fabric into the current Pro parcel fabric. It has several overlapping parcel subtypes in a single parcel type and I want to move them into separate parcel types that would not overlap. As far as I can see all of the standard fields created by the parcel fabric are in both the polygon and polyline feature classes of the source parcel type. I added a new Parcel Type and copied all of the custom fields from the source Parcel Type in both the polygons and polylines feature classes into the new Parcel Type. I started the editor and selected a set of parcels that I want to change to my new parcel type. I then clicked the Change Parcel Type tool and it brought up the dialog showing both of my parcel types. I selected the new target parcel type and pressed Run, but it immediately fails and gives me an error saying "Dataset has missing required fields". Has anyone seen this error? Are there any suggestions on the steps I should follow to troubleshoot this? Or suggested workarounds if I cannot get the Change Parcel Type tool to work?
10-28-2024 02:34 PM

IDEA
|
You should also check out these two ideas for other issues related to the Transfer Attributes tool, and please upvote them if you agree with the issues I raise:
Attribute Transfer Tool Equivalency in ArcGIS Pro
Pro Attribute Field Mapping Set Up Problems
08-08-2024 03:52 PM

IDEA
|
I am no longer authorized to contact Esri Support in my organization, since I retired and now work through the Temporary Assistance Program. However, I know someone who can submit a case, and I will work with them to do that. I will be upgrading to ArcGIS Pro 3.2 next week, so I will wait until I can confirm that this issue has still not been addressed in that version before submitting a case.
08-08-2024 03:45 PM

IDEA
|
The Attribute Transfer Tool Field Mapping can be set up to transfer the Source Shape field over to the Target Shape field. In ArcGIS Pro 2.9 at least, when the source feature's polygon or polyline shape contains true curves, those true curves are not transferred to the target feature's shape. Instead, the Transfer Attributes tool transfers a densified curve made up of straight-line segments. Here is an example of a feature containing numerous true curve segments within its geometry. If I use the Attribute Transfer tool to transfer the shape of this feature back to itself, the feature is densified. This should not be occurring. In ArcMap, true curves in the source geometry are transferred to the target feature's geometry when the Transfer Attributes tool is used. Below is the same Attribute Transfer set-up in ArcMap and the result of transferring the shape of the same feature with true curves back to itself. When I transfer the above feature's shape to another feature, it retains any true curves that came from the source. ArcMap does not allow transfers from a feature back to itself; this difference may be required in Pro, since the Attribute Transfer tool Field Mapping also controls ArcGIS Pro's Copy/Paste behavior, which I have discussed in another post. The Attribute Transfer tool in ArcGIS Pro needs to be made equivalent to the ArcMap version in its behavior of preserving true curves. This is in addition to numerous other behavior differences that I have described in other posts.
08-08-2024 11:46 AM

IDEA
|
For many polyline feature class data models, a pair of fields or a set of four fields hold values that depend on the directional orientation of the line. Examples of direction-dependent pairs include fields describing the left and right side of the line, such as the LEFT_FID and RIGHT_FID fields in the output of the Polygon To Line tool when the keep neighboring polygon information option is checked, and road centerline address fields holding city and zip code values for the left and right side of the line. Field pairs can also be associated with the From and To ends of the line, which are direction dependent, such as fields for the X and Y values defining the ends of the lines. An example of a set of four fields that depend on both the left and right side of the line and the from and to ends of the line is the set of road centerline fields for the Left From, Left To, Right From and Right To house number values of address ranges.

A common problem arises when, for any reason, the directional orientation of the geometry of all or part of these lines needs to be flipped using the Flip Lines geoprocessing tool or the Flip modify-feature editing tool, and any of these direction-dependent values needs to be flipped to match the new orientation. There is no easy out-of-the-box way to do that. A geoprocessing approach to this problem involves creating either temporary fields or a temporary joinable copy of the table records to perform the flip/swap of these attributes, since the values of at least one of the fields need to be temporarily preserved in a location separate from the pair of fields being flipped/swapped. If a temporary field is created, three separate field calculations in a very specific order have to be performed to accomplish the flip/swap of just one pair of field values. If a temporary joinable copy of the records to be flipped is made, two calculations need to be done for each pair of fields. And the selection of features that had their geometry flipped must be preserved at every step for the whole process to succeed.

This entire problem is ideal for a tool that would include a field-pair mapping parameter, like the ArcGIS Pro Calculate Geometry tool. Numerous geoprocessing tools have been specifically designed to handle storage of data in memory or in temporary workspaces to remove the burden of managing these operations from the user. This interface would let the user perform all value flip/swap operations in a single run and would be far easier to set up and execute than anything else currently offered. A sample field mapping for performing a swap is shown below:

Swap Field 1    Swap Field 2
LEFT_FID        RIGHT_FID
LEFT_FROM_NO    RIGHT_TO_NO
LEFT_TO_NO      RIGHT_FROM_NO
LEFT_CITY       RIGHT_CITY
LEFT_ZIP        RIGHT_ZIP
FROM_X          TO_X
FROM_Y          TO_Y

The tool would move every value held in the first field of each pair into memory or a temporary workspace and then perform the swap of values between the two fields. So, for example, for every record the LEFT_FID, LEFT_FROM_NO, LEFT_TO_NO, LEFT_CITY, LEFT_ZIP, FROM_X and FROM_Y values would first be temporarily stored in memory. Then the RIGHT_FID value would move to the LEFT_FID field, the RIGHT_TO_NO value to the LEFT_FROM_NO field, the RIGHT_FROM_NO value to the LEFT_TO_NO field, the RIGHT_CITY value to the LEFT_CITY field, the RIGHT_ZIP value to the LEFT_ZIP field, the TO_X value to the FROM_X field and the TO_Y value to the FROM_Y field. Next, the LEFT_FID memory value would move to the RIGHT_FID field, the LEFT_FROM_NO memory value to the RIGHT_TO_NO field, the LEFT_TO_NO memory value to the RIGHT_FROM_NO field, the LEFT_CITY memory value to the RIGHT_CITY field, the LEFT_ZIP memory value to the RIGHT_ZIP field, the FROM_X memory value to the TO_X field and the FROM_Y memory value to the TO_Y field. Finally, the tool would perform any memory and temporary-workspace clean-up for the user.

This whole set of operations, done one step at a time by the user, is prone to errors, since it involves a minimum of either 16 or 21 steps, all of which involve working with very similar field names that can easily be confused or be difficult for the user to keep track of. Even using the Calculate Fields tool, which has a field mapping, it can be confusing, because the operation involves a minimum of three separate steps that each involve numerous fields with very similar names and that have to be performed in an exact order. The interface set up above is much easier to understand than any manual workflow I could describe, and once the tool is developed and debugged these operations become standardized and will be performed infallibly and with easy repeatability once the user configures the field map correctly for a given polyline feature class. This tool would naturally follow or precede the use of the Flip Lines geoprocessing tool as a separate step for the same feature selection. Having this tool in combination with the Flip tools would allow the user to make simple modifications to the field map if they only want to swap a subset of these attributes, because they know the other attributes are already oriented correctly with the flipped geometry and may have been the cause of the geometry flip in the first place.
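As a side note, the temporary-storage dance described above collapses in Python into one simultaneous tuple assignment per field pair. This is a plain-Python sketch with made-up field names and sample rows (in a real workflow the same swap logic would sit inside an arcpy.da.UpdateCursor loop over the flipped selection, which is not shown here):

```python
# Field order for each simulated row; with arcpy this would be the
# field list passed to an UpdateCursor (names here are hypothetical).
fields = ["LEFT_FID", "RIGHT_FID", "LEFT_FROM_NO", "RIGHT_TO_NO",
          "FROM_X", "TO_X"]

# The field-pair mapping proposed above, as (Swap Field 1, Swap Field 2).
swap_pairs = [("LEFT_FID", "RIGHT_FID"),
              ("LEFT_FROM_NO", "RIGHT_TO_NO"),
              ("FROM_X", "TO_X")]

# Precompute each field's position once, outside the row loop.
idx = {name: i for i, name in enumerate(fields)}

# Simulated rows for two flipped lines (sample values only).
rows = [
    [101, 102, 100, 198, 5000.0, 5250.0],
    [103, 104, 200, 298, 5250.0, 5500.0],
]

for row in rows:
    for f1, f2 in swap_pairs:
        # Simultaneous assignment swaps the pair with no temporary field.
        row[idx[f1]], row[idx[f2]] = row[idx[f2]], row[idx[f1]]
    # with arcpy: updateRows.updateRow(row) would go here

print(rows[0])  # → [102, 101, 198, 100, 5250.0, 5000.0]
```

All pairs swap in a single pass per record, which is exactly the behavior the proposed field-pair mapping parameter would standardize for users who do not script.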
08-06-2024 07:23 AM

IDEA
|
Additionally, if the field map does support ObjectID preservation, please describe the set-up steps, since just checking the ObjectID field in the field map does not reliably work in my experience. For example, checking the ObjectID field in the field map of the Dissolve tool completely fails to preserve the actual unique ObjectID values of the source. Instead, it behaves as if I had set it to use the Min or First option rather than to get the unique values of the ObjectID. The Dissolve tool output seems to generate multiple features as though it were trying to preserve the unique values, but in the end multiple records have duplicate values in all of the fields because the real unique ObjectID values were not preserved.
08-05-2024 01:28 PM