IDEA
I had proposed a similar idea for the same reasons under Feature-class-to-feature-class-options-to-preserve-ObjectID-and-GlobalID. I agree there are numerous situations that make adding and maintaining a field in the source FC to preserve a static unique ObjectID value impossible. Even when it is possible, such a field is difficult to set up in the source, since it needs to be continuously maintained and can have unintended consequences for other behaviors like Editor Tracking.
Posted 11-01-2024 07:21 AM | Kudos 0 | Replies 0 | Views 125

POST
I appreciate all of the suggestions. I have not tried the duplicate parcels tool yet, but if it does not give an error it sounds like a possible workaround. The original parcel fabric had Editor Tracking disabled on all feature classes, while the new parcel type I created had Editor Tracking enabled. I enabled Editor Tracking on the records, the parcel fabric adjustment feature classes, the connection lines and points, and the original parcel type polygons and polylines. I then tried the Change Parcel Type tool again and it worked.
Posted 10-29-2024 09:28 AM | Kudos 0 | Replies 0 | Views 205

POST
ArcGIS Pro 3.2, Parcel Fabric version 5. I am trying to use the Change Parcel Type tool in the Parcel Record Workflows tab. The parcel fabric is from another organization and at some point was converted from an ArcMap parcel fabric into the current Pro parcel fabric. It has several overlapping parcel subtypes in a single parcel type, and I want to move them into separate parcel types that would not overlap. As far as I can see, all of the standard fields created by the parcel fabric are in both the polygon and polyline feature classes of the source parcel type.

I added a new parcel type and copied all of the custom fields from the source parcel type, in both the polygon and polyline feature classes, into the new parcel type. I started the editor and selected a set of parcels that I want to change to my new parcel type. I then clicked the Change Parcel Type tool and it brought up the dialog showing both of my parcel types. I selected the new target parcel type and pressed Run, but it immediately fails with an error saying "Dataset has missing required fields".

Has anyone seen this error? Are there any suggestions on the steps I should follow to troubleshoot this, or suggested workarounds if I cannot get the Change Parcel Type tool to work?
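One way to hunt for the culprit field is to diff the field-name lists of the source and target parcel type feature classes. A minimal Python sketch of that check; the field names below are hypothetical stand-ins, and in practice each list could come from `[f.name for f in arcpy.ListFields(fc)]`:

```python
# Sketch: compare two feature classes' field lists to find candidates for a
# "Dataset has missing required fields" error. The field-name lists are
# hypothetical stand-ins, not the actual parcel fabric schema.

def missing_fields(source_fields, target_fields):
    """Return names present in source but absent from target (case-insensitive)."""
    target_lower = {name.lower() for name in target_fields}
    return sorted(name for name in source_fields if name.lower() not in target_lower)

source = ["OBJECTID", "Shape", "Name", "CreatedByRecord", "RetiredByRecord", "StatedArea"]
target = ["OBJECTID", "Shape", "Name", "CreatedByRecord", "StatedArea"]

print(missing_fields(source, target))  # ['RetiredByRecord']
```

Running this against both the polygon and polyline feature classes of each parcel type would quickly show whether a standard field was dropped when the custom fields were copied over.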
Posted 10-28-2024 02:34 PM | Kudos 0 | Replies 4 | Views 276

IDEA
You should also check out these two ideas for other issues related to the Transfer Attributes tool, and please upvote them if you agree with the issues I raise:
- Attribute Transfer Tool Equivalency in ArcGIS Pro
- Pro Attribute Field Mapping Set Up Problems
Posted 08-08-2024 03:52 PM | Kudos 0 | Replies 0 | Views 457

IDEA
I am no longer authorized to contact Esri Support for my organization, since I retired and now work through the Temporary Assistance Program. However, I know someone who can submit a case, and I will work with them to do that. I will also be upgrading to ArcGIS Pro 3.2 next week, so I will wait until I can confirm that this issue has still not been addressed in that version before submitting a case.
Posted 08-08-2024 03:45 PM | Kudos 0 | Replies 0 | Views 461

IDEA
The Attribute Transfer tool Field Mapping can be set up to transfer the Source Shape field over to the Target Shape field. In ArcGIS Pro 2.9 at least, when the Source feature's polygon or polyline Shape contains true curves, those true curves are not transferred to the Target feature's Shape. Instead, the Transfer Attributes tool transfers a densified curve made up of straight line segments.

Here is an example of a feature containing numerous true curve segments within its geometry. If I use the Attribute Transfer tool to transfer the Shape of this feature back to itself, the feature is densified. This should not be occurring.

In ArcMap, true curves in the source geometry are transferred to the target feature's geometry when the Transfer Attributes tool is used. Below is the same Attribute Transfer set up in ArcMap and the result of transferring the shape of the same feature with true curves back to itself. When I transfer the above feature's shape to another feature, it retains any true curves that came from the source. ArcMap does not allow transfers from a feature back to itself; this difference may be required in Pro, since the Attribute Transfer tool Field Mapping also controls ArcGIS Pro's Copy/Paste behavior, which I have discussed in another post.

The Attribute Transfer tool in ArcGIS Pro needs to be made equivalent to the ArcMap version in its behavior of preserving true curves. This is in addition to numerous other behavior differences that I have described in other posts.
Posted 08-08-2024 11:46 AM | Kudos 3 | Replies 4 | Views 499

IDEA
For many polyline feature class data models, a pair of fields or a set of four fields hold values that depend on the directional orientation of the line. Examples of direction dependent field pairs include fields describing the left and right side of the line, such as the LEFT_FID and RIGHT_FID fields in the output of the Polygon To Line tool when the keep neighboring polygon information option is checked, or road centerline address fields holding city and ZIP Code values for the left and right side of the line. Field pairs can also be associated with the From and To ends of the line, which are direction dependent, such as fields for the X and Y values defining the ends of the lines. An example of a set of four fields that are direction dependent on both the left and right side of the line and the From and To ends of the line is the set of road centerline fields for the Left From, Left To, Right From and Right To house number values of address ranges.

A common problem arises when, for any reason, the directional orientation of the geometry of all or some of these lines needs to be flipped using the Flip Lines geoprocessing tool or the Flip modify-features editing tool, and any of these direction dependent values needs to be flipped to match the new orientation. There is no easy out-of-the-box way to do that. A geoprocessing approach to this problem involves creating either temporary fields or a temporary joinable copy of the table records to perform the flip/swap of these attributes, since the values of at least one of the fields need to be temporarily preserved in a location that is separate from the pair of fields being flipped/swapped. If a temporary field is created, three separate field calculations in a very specific order have to be performed to accomplish the flip/swap of just one pair of field values. If a temporary joinable copy of the records to be flipped is made, two calculations need to be done for each pair of fields. And the selection of features that had their geometry flipped must be preserved at every step for the whole process to succeed.

This entire problem is ideal for a tool that would include a field pair mapping parameter, like the ArcGIS Pro Calculate Geometry tool. Numerous geoprocessing tools have been specifically designed to handle storage of data in memory or in temporary workspaces to remove the burden of managing these operations from the user. This interface would let the user perform all value flip/swap operations in a single run and would be far easier to set up and execute than anything else currently offered. A sample field mapping for performing a swap is shown below:

Swap Field 1 | Swap Field 2
---|---
LEFT_FID | RIGHT_FID
LEFT_FROM_NO | RIGHT_TO_NO
LEFT_TO_NO | RIGHT_FROM_NO
LEFT_CITY | RIGHT_CITY
LEFT_ZIP | RIGHT_ZIP
FROM_X | TO_X
FROM_Y | TO_Y

The tool would move every value held in one of these pairs of fields into either memory or a temporary workspace and then perform the swap of values between the two fields. So, for example, for every record the LEFT_FID, LEFT_FROM_NO, LEFT_TO_NO, LEFT_CITY, LEFT_ZIP, FROM_X and FROM_Y values would first be temporarily stored in memory. Then the tool would move the RIGHT_FID value to the LEFT_FID field, the RIGHT_TO_NO value to the LEFT_FROM_NO field, the RIGHT_FROM_NO value to the LEFT_TO_NO field, the RIGHT_CITY value to the LEFT_CITY field, the RIGHT_ZIP value to the LEFT_ZIP field, the TO_X value to the FROM_X field and the TO_Y value to the FROM_Y field. Next, it would move the LEFT_FID memory value to the RIGHT_FID field, the LEFT_FROM_NO memory value to the RIGHT_TO_NO field, the LEFT_TO_NO memory value to the RIGHT_FROM_NO field, the LEFT_CITY memory value to the RIGHT_CITY field, the LEFT_ZIP memory value to the RIGHT_ZIP field, the FROM_X memory value to the TO_X field and the FROM_Y memory value to the TO_Y field. Finally, it would perform any memory and temporary workspace clean-up for the user.

Doing this whole set of operations one step at a time by hand is prone to errors, since it involves a minimum of either 16 or 21 steps, all of which involve working with very similar field names that can easily be confused or be difficult for the user to keep track of. Even using the Calculate Fields tool, which has a field mapping, can be confusing, because the operation involves a minimum of three separate steps that each involve numerous fields with very similar names and that have to be performed in an exact order. The interface set up above is much easier to understand than any manual workflow I could describe, and once the tool is developed and debugged these operations become standardized and can be performed reliably and repeatably once the user configures the field map correctly for a given polyline feature class. This tool would naturally follow or precede the use of the Flip Lines geoprocessing tool as a separate step for the same feature selection. Having this tool in combination with the Flip tools would allow the user to make simple modifications to the field map if they only want to swap a subset of these attributes, because they know that the other attributes are already oriented correctly with the flipped geometry and may have been the cause of the geometry flip in the first place.
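The swap logic the proposed tool would standardize can be sketched in a few lines of Python. This is an illustration of the requested behavior, not an existing tool: plain dictionaries stand in for table rows, the field names come from the sample mapping above, and Python's tuple assignment plays the role of the temporary in-memory storage the tool would manage for the user.

```python
# Sketch of the proposed pairwise swap, applied to a plain dict standing in
# for one flipped polyline's attribute row (hypothetical schema).

SWAP_PAIRS = [
    ("LEFT_FID", "RIGHT_FID"),
    ("LEFT_FROM_NO", "RIGHT_TO_NO"),
    ("LEFT_TO_NO", "RIGHT_FROM_NO"),
    ("LEFT_CITY", "RIGHT_CITY"),
    ("LEFT_ZIP", "RIGHT_ZIP"),
    ("FROM_X", "TO_X"),
    ("FROM_Y", "TO_Y"),
]

def swap_direction_fields(row, pairs=SWAP_PAIRS):
    """Swap each mapped field pair in place for one flipped line's attributes."""
    for a, b in pairs:
        row[a], row[b] = row[b], row[a]  # tuple assignment = implicit temp storage
    return row

row = {"LEFT_FID": 10, "RIGHT_FID": 20,
       "LEFT_FROM_NO": 100, "RIGHT_TO_NO": 199,
       "LEFT_TO_NO": 198, "RIGHT_FROM_NO": 101,
       "LEFT_CITY": "Alta", "RIGHT_CITY": "Bell",
       "LEFT_ZIP": "90001", "RIGHT_ZIP": "90002",
       "FROM_X": 0.0, "TO_X": 50.0, "FROM_Y": 0.0, "TO_Y": 75.0}

swap_direction_fields(row)
print(row["LEFT_FID"], row["RIGHT_FID"])  # 20 10
```

A real tool would apply this per record via an update cursor over the flipped selection, but the field-pair mapping parameter is the essential interface.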
Posted 08-06-2024 07:23 AM | Kudos 0 | Replies 0 | Views 267

IDEA
Additionally, if the field map does support ObjectID preservation, please describe the setup steps, since just checking the ObjectID field in the field map does not reliably work in my experience. For example, checking the ObjectID field in the field map of the Dissolve tool completely fails to preserve the actual unique ObjectID values of the source. Instead, it behaves as though I had set it to use the Min or First option rather than the Unique values of the ObjectID. The Dissolve tool output seems to generate multiple features as though it were trying to preserve the unique values, but in the end multiple records have duplicate values in all of the fields because the real unique ObjectID values were not preserved.
Posted 08-05-2024 01:28 PM | Kudos 0 | Replies 0 | Views 337

IDEA
I was using the ArcMap tool when I wrote this post, and I am using a version of ArcGIS Pro that apparently predates the release of the Export Features and Export Table tools, so at this time I cannot test them. However, ArcGIS Pro 2.9 does offer the "Preserve Global IDs" environment setting, and that does work with Feature Class To Feature Class and Table To Table at 2.9, so that aspect of my request can be considered implemented. However, the bulk of the data I deal with only uses ObjectID as the primary key unique record value, and preserving these values is paramount to many of my workflows. I do note that the Export Features and Export Table tools include a field map, which may allow me to preserve the ObjectID. However, other tools that include a field map have been notoriously unsuccessful and unreliable at preserving ObjectID values up through ArcGIS Pro 2.9, and I cannot test whether this is a viable solution, since the field map is not available with Feature Class To Feature Class or Table To Table. I do not foresee upgrading to ArcGIS Pro 3.x any time soon in my organization, although I may push to upgrade my own ArcGIS Pro installation sooner if you can confirm the Export Features and Export Table field map offers ObjectID value preservation. If you can confirm that the field map capability of the Export Features and Export Table tools supports ObjectID preservation, then you can mark this idea as Implemented. But if it does not preserve the ObjectID foreign key values reliably, then consider implementing this aspect of the idea through the field map or as an additional option.
Posted 08-05-2024 11:58 AM | Kudos 0 | Replies 0 | Views 347

POST
The original suggestion using MOD only reliably works if the field being evaluated is a Short, Long or Big Integer field. However, if the field being evaluated is a Float or Double and contains any values that are not whole numbers, the expression suggested by @MatthewLeonard is the way to go.
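A quick plain-Python illustration of why a MOD-based even/odd check breaks down on Float/Double values (no particular field calculator expression is assumed here; `looks_even` is just a stand-in for the MOD test):

```python
# Why a MOD-based even/odd test is only safe on integer fields: modulo is
# well defined on floats, but a non-whole value is neither "even" nor "odd",
# so the test silently misclassifies it instead of failing.

def looks_even(value):
    return value % 2 == 0

print(looks_even(42))   # True  -> correct for a Short/Long/Big Integer value
print(looks_even(7))    # False -> correct
print(looks_even(7.5))  # False -> but 7.5 is not odd either; MOD alone
                        #          cannot classify non-whole doubles
print(7.5 % 2)          # 1.5   -> the remainder is neither 0 nor 1
```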
Posted 08-03-2024 09:18 AM | Kudos 1 | Replies 1 | Views 587

IDEA
Being able to retain, in any extract of data that the user wants to create, foreign key values that match the source primary key values is fundamental to most data administration workflows. Extractions of records from a source that relies on its OBJECTIDs and/or its GLOBALIDs as its primary key in relationships to other records should not be relationally destroyed relative to the source they came from when the user has an explicit requirement to preserve those values as a foreign key in the extracted output. However, this kind of relational destruction currently always happens whenever the Feature Class To Feature Class and Table To Table tools are used, which are the default tools for creating extractions in ArcGIS Pro.

The Feature Class To Feature Class and Table To Table tools (and possibly other tools) need two added options that would allow the user to create new foreign key fields that preserve the OBJECTID and/or GLOBALID values of the source feature class/table in the output. If the user checks either of these options, a control in the tool should suggest a default field name for each, which the user could override. Validation should ensure that the source contains OBJECTID/GLOBALID fields before enabling these options, and that the user definable field names are valid, are not already in the source feature class/table, and are not duplicated if the user checks both options. The source OBJECTID values would be preserved in a Long field and the source GLOBALID values in a GUID field.

The use case for this is when the only field(s) that contain a unique value for each of the source features or rows are the OBJECTID and/or the GLOBALID. Often the source table is not owned by the user and cannot be modified to store these values in a Long and/or GUID field in advance. Using either of these tools overwrites the source OBJECTID and GLOBALID values with new values in the output, especially when the user only wants to extract a subset of records. Having to copy the source data into a workspace the user can modify, in order to add these fields in advance, is very time consuming when the source contains large numbers of features and/or rows, and the user has to repeatedly copy the source every time they need a new extraction if they have any reason to believe the source contains new records that they want. This Copy workaround is especially burdensome to the user who only wants a small subset of the source features or rows, not an entire copy, in order to apply the fundamentals of relational database design to their output. There are also many instances when the Copy function is not even supported for this workaround, such as when the source feature class is part of a topology within a feature dataset.

Incorporating the preservation of the OBJECTIDs in a static foreign key field in the output of these two tools is absolutely possible, as demonstrated by numerous tools like Spatial Join, Union, Intersect, Polygon To Line, Multipart To Singlepart, etc., all of which preserve a foreign key FID field in their output. Extending this kind of foreign key preservation to also allow for the preservation of GLOBALID values in a GUID foreign key output field should not be significantly more difficult. This would be a great enhancement and time saver for users who need extractions from Parcel Fabric data sources and any other advanced feature dataset capabilities that are built around GLOBALIDs as primary keys for relationships. Adding a GLOBALID-to-GUID option to the many other tools that currently preserve OBJECTIDs in a foreign key FID field by default would be a great time saver as well, but that is less critical, since those outputs at least already preserve usable foreign key values from the source OBJECTIDs by default. Additionally, some of those tools support field mapping, which may allow the user to accomplish the GLOBALID value preservation themselves.

But the Feature Class To Feature Class and Table To Table tools currently do not offer any option that allows the user to preserve any usable foreign key values in their output relative to sources that rely on OBJECTIDs or GLOBALIDs as their sole unique primary keys. If the existing Feature Class To Feature Class and Table To Table tools should not be modified to add this option because they are accessible to all license levels and intentionally provide limited functionality as a result, then a new pair of Conversion tools, called Feature Class To Feature Class With FIDs and Table To Table With FIDs, should be added for Advanced license users at minimum. These two new tools would be a selling point for many Basic and Standard users to consider upgrading to an Advanced license, as well as two very fitting and useful additions for truly Advanced users.
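The requested option's behavior can be sketched in plain Python, with dictionaries standing in for table rows. The `ORIG_OID` field name is a hypothetical default for the new foreign key field, not an existing tool parameter:

```python
# Sketch of the requested behavior: export a subset of rows, let the output
# get new sequential OBJECTIDs (as the tools do today), but preserve the
# source OBJECTID in a user-named Long foreign key field.

def export_with_fids(rows, where, fk_field="ORIG_OID"):
    """Copy rows matching `where`, keeping the source OID as a foreign key."""
    out = []
    for new_oid, row in enumerate((r for r in rows if where(r)), start=1):
        copy = dict(row)
        copy[fk_field] = row["OBJECTID"]   # static foreign key to the source
        copy["OBJECTID"] = new_oid         # the output gets its own new key
        out.append(copy)
    return out

source = [{"OBJECTID": 7, "NAME": "A"},
          {"OBJECTID": 12, "NAME": "B"},
          {"OBJECTID": 31, "NAME": "C"}]

subset = export_with_fids(source, where=lambda r: r["NAME"] != "B")
print([(r["OBJECTID"], r["ORIG_OID"]) for r in subset])  # [(1, 7), (2, 31)]
```

The point of the sketch is that the output can keep valid, compacted OBJECTIDs of its own while the `ORIG_OID` column remains a stable join key back to the source, exactly as the FID field does in Spatial Join or Polygon To Line output.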
Posted 08-02-2024 03:59 PM | Kudos 0 | Replies 3 | Views 446

IDEA
In my opinion, the centralization of the Attribute Field Mapping configuration into a single set up interface that simultaneously affects virtually all editing tools is a fundamentally flawed design. It introduces confusion, contradiction, conflicts, constraints and inefficiency into my workflows that never existed before. It seems designed to only work if I repeatedly use just one of the tools that relies on this set up and rarely switch to any other tool that is affected by it. While I can imagine there are some users who think the centralized Field Mapping set up is an improvement, I doubt they have developed many workflows that switch between the different tools impacted by these settings.

I think anyone who considers the behaviors of the different editing tools understands why ArcMap never centralized them into a single set up. The Attribute Transfer tool set up is designed around the spatial behavior of that tool, which works best with only a very small set of field mappings in place, especially when the most common use involves transferring data between overlapping features in two different feature classes. The Copy/Paste behavior, on the other hand, needs nearly every attribute mapping set up by default, especially for copying and pasting records within the same feature class/attribute table. But the Copy/Paste field mapping set up means I have to turn off the layer covering the other in order to use Attribute Transfer to click between the two feature classes, adding a lot of clicks that are not necessary in ArcMap. Conversely, my ideal Attribute Transfer field mapping set up would cause the Copy/Paste behavior to fail to preserve attributes in Pro. I never had to think about that in ArcMap while setting up the Attribute Field Map for the Transfer Attributes tool. The two behaviors were never meant to control each other, but now, like it or not, I am being forced into an unwanted choice with unintended impacts on both behaviors.

Using the ArcGIS Pro Attribute Field Mapping now feels like trying to configure the separate behaviors of four traffic signal arms where I can only set up one of them and every change I make to one signal arm is applied to all of the signal arms simultaneously and identically. The result is not the smooth flow of traffic through the intersection, but confusion, collisions and a tremendous slowdown once the drivers realize the signal arms are all showing green, yellow or red at the same time rather than operating independently and in concert with each other. While playing from a single sheet of music may be fine for a solo performance, no symphony worth listening to was ever performed by giving all the players of the different instruments the same sheet of music to play at the same time. The Copy/Paste function, the Attribute Transfer tool, and any other editing tools affected by the Attribute Field Mapping settings are symphony players, and the Attribute Field Mapping is their sheet music. They need to be given separate sheets of music to play a symphony together. They are rarely meant to play the melody, harmony or rhythm identically when they play together, and you can't get them to play together by having only one sheet of music available for all of the instruments to read from at any given moment.

I want the field maps of the editing tools to be separable again. The superficial fact that the user interface required to set up the field mappings of the different editing tools can look identical did not justify connecting all of the editing tool behaviors that use a field mapping to a single centralized model underneath. Adding field mapping capabilities to the Copy/Paste function is an improvement insofar as it lets me map between different field names for that operation, which I couldn't do in ArcMap. However, that set up should not have been combined with the set up of the Attribute Transfer tool in the mistaken belief that they use this functionality in fundamentally the same way.
Posted 07-18-2024 02:58 AM | Kudos 4 | Replies 1 | Views 400

POST
The problem reappeared in a template map. I found that the coordinate system of the template map's data frame did not match my feature classes, and changing the data frame's coordinate system fixed it. So the project was not corrupt; I am just still not used to tracking down such basic things in ArcGIS Pro compared to ArcMap. It took me this long to figure out that I needed to right click the map root container at the top of the layer list to check that setting, and I only figured that out because I had two maps in the same project that were behaving completely differently. The other thing this showed me is that true curves have to be displayed in their native projection to appear and behave correctly. Attempting to display them in any other projection on the fly, without formally applying the Project tool, simply does not work.
Posted 07-08-2024 09:51 AM | Kudos 1 | Replies 0 | Views 642

POST
After further investigation, the strange behaviors I was observing all seem to be due to using a corrupted project. Bringing the data into a fresh project seems to have eliminated the issue with the true curves. The behavior was primarily being observed in ArcGIS Pro. I am now no longer as certain that the exact same behaviors were appearing in ArcMap, or whether I was assuming they were on the basis of some behaviors that I may not have investigated as thoroughly as I thought. Bottom line: if a project begins to act strangely, try building a fresh one.
Posted 07-08-2024 08:54 AM | Kudos 0 | Replies 0 | Views 652

POST
During preparations for migrating to a Parcel Fabric, I am examining the results from the Simplify by Straight Lines and Circular Arcs (SLACA) tool in some detail. I have several instances of strange results that I want to discuss with Esri technical support to see whether these are expected behaviors or, more likely, signs of bugs. I am using ArcGIS Pro 2.9 and ArcMap 10.8.1.

This is an example of an original densified polyline that makes up part of a polygon boundary, shown next to the results of running the Buffer tool on the same polygons with a negative 1-foot buffer (-1'). Everything is as expected. The dark red line below shows the output of the SLACA tool using a 0.5-foot distance tolerance from the polyline edges. The curves generated are clearly farther than 0.5 feet away from the original polyline edges (exactly 5' offset rather than 0.5' offset in the most extreme case, as far as I can tell) and visibly cross into the -1' buffer polygon. However, when I use the Select By Location tool to find the set of SLACA lines that intersect the buffer polygons, none of these lines are selected.

I created a copy of the SLACA lines and ran the Densify tool using the ANGLE option to only alter curves, with a vertex placed at every 0.5-degree change. The output line in purple is nearly restored to the original form of the polyline before I ran the SLACA tool, when it did not intersect the buffer polygons. This is apparently the way the radial bend/curve geometry is being interpreted by the Select By Location tool, which would explain why that tool did not identify any intersection between the two features.

The Densified Output Shown By Itself
The Original and Densified Polylines Overlaying Each Other

When I build polygons using the SLACA lines, the output is visibly identical to the SLACA lines. If I select one of those SLACA polygons (the bulb in the center of the picture), it looks like this relative to the negative buffer polygons. If I run the Select By Location tool using this selected polygon to select all intersecting buffer polygons, it only selects the buffer polygon that makes up the center of the bulb and nothing else.

This would seem to indicate bugs either in the depiction of the radial bend/curve on the screen, so that it does not accurately represent the underlying geometry being stored with the feature, or in the way that stored geometry is being processed. I think it is most likely that the screen depiction is not an accurate representation of the underlying geometry stored with these features, but either way the disconnect between what the user sees and how the feature behaves in standard geoprocessing operations will clearly cause confusion for the viewer and the analyst (as it did for me).

The good news is that the fundamental true curve geometry being stored in the feature apparently is correct within the tolerances I set in the geoprocessing tools, and is apparently not being corrupted by any of the geoprocessing steps I performed. Otherwise the Densify tool could not have created an output that met my tolerance criteria and virtually restored the original shape. But the bad news is that the depiction of the true curve is visibly distorted in a way that would lead the user to believe the geometry has been corrupted and does not match the tolerance settings. Additionally, interacting with the geometry on screen behaves as though the depicted distortion is really there, i.e., the user has to click on what they see on screen to identify the feature they are viewing, even though in reality the coordinates being clicked would not touch the underlying geometry.

If the screen depiction and interactions were corrected to actually match the stored underlying geometry the geoprocessing tools are generating, this would likely resolve many of the frustrations users have expressed over the years about how true curves behave. Additionally, only a relatively small subset of curves exhibit this behavior. That suggests this is only affecting one or perhaps two subroutines of the drawing module that are only triggered under specific circumstances. This would explain the seemingly random appearance of gaps or overlaps on screen after inputting geometry that should perfectly align, because the behavior is not universal. As evidence, no gaps/overlaps were created between any true curves when I created my polygons using only a single polyline to define the shared boundary between every left and right polygon, whether the boundary contained true curves or not. Artificial gaps/overlaps can only be created when separate polylines define the shared boundaries between two polygons and can be depicted using different drawing subroutines. Such randomness leads users to mistrust the results even more than an easily explained universal error that everyone understands, because it is both harder to detect/predict and harder to explain. But in reality this means, to me, that the overall routines involving true curves are trustworthy, and the effect is actually less damaging to our data outputs than what we feel we are observing. Correcting this behavior would have a much greater impact on user perceptions of the trustworthiness of true curves than the bug is actually having on the true curves themselves.

At the same time, this bug may never be fixed. The performance requirements of the drawing module and the geoprocessing module are so different that any resolution that results in a noticeable degradation in drawing performance will likely not be implemented. The number of users who would tolerate a noticeable drawing performance drop is certainly far smaller than the set of users who would continue to tolerate the production of these small sets of features with these apparent distortions and geoprocessing behavior disconnects. This geometry appears and performs the same way in ArcMap and ArcGIS Pro, so this apparent bug has either not been detected, or has been detected and rejected for fixing, since the original implementation of true curves in Esri's products.
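One way to sanity-check the Densify ANGLE result described above: the maximum distance between a circular arc and the chord that replaces it (the sagitta) is s = r(1 - cos(θ/2)), where θ is the densify angle step. A short sketch, under the assumption that each densified segment spans exactly the 0.5-degree step, showing why that step keeps even large-radius curves within a half-foot of the true arc:

```python
# Max arc-to-chord deviation (sagitta) for one densified segment:
#   s = r * (1 - cos(theta / 2))
# where r is the arc radius and theta is the angular Densify step.

import math

def sagitta(radius, step_deg):
    """Max deviation between an arc of `radius` and a chord spanning `step_deg`."""
    return radius * (1 - math.cos(math.radians(step_deg) / 2))

for r in (50, 500, 5000, 50000):
    print(r, sagitta(r, 0.5))

# Even a 50,000 ft radius stays under ~0.48 ft of deviation at a 0.5-degree
# step, which is consistent with the densified line virtually restoring the
# original shape within a sub-foot tolerance.
```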
Posted 07-04-2024 08:59 AM | Kudos 0 | Replies 2 | Views 742