IDEA
There are a few other things described in the comments, but the main point of the Idea is support for transferring domain descriptions from the input to the output KML/KMZ. This issue will be resolved in the ArcGIS Pro 3.3 release, now in Beta. The Layer To KML tool now honors the choice in the Transfer Domain Descriptions environment: https://pro.arcgis.com/en/pro-app/latest/tool-reference/environment-settings/transfer-domain-descriptions.htm. You will need to apply that setting to get the description values in the output KML.
04-08-2024
03:05 PM
IDEA
In ArcGIS Pro 3.3, the "first" record in the one-to-first join will be the record with the matching field value that has the lowest Object ID. Additional sorting options may be provided in a future release, based on the Idea https://community.esri.com/t5/arcgis-pro-ideas/join-control-what-related-record-gets-used/idi-p/1237557.
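The matching rule can be illustrated in plain Python (this is just a sketch of the behavior described above, not the actual implementation; the table contents and field names are made up): for each target key, the join keeps the matching record with the lowest Object ID.

```python
# Illustration of the "one to first" matching rule: for each target key,
# keep the joined record whose Object ID is lowest among all matches.
# Table contents and field names here are hypothetical.
join_rows = [
    {"OBJECTID": 7, "KEY": "A", "VAL": "late"},
    {"OBJECTID": 2, "KEY": "A", "VAL": "early"},
    {"OBJECTID": 5, "KEY": "B", "VAL": "only"},
]

def first_match(key, rows):
    """Return the matching row with the lowest OBJECTID, or None."""
    matches = [r for r in rows if r["KEY"] == key]
    return min(matches, key=lambda r: r["OBJECTID"]) if matches else None

print(first_match("A", join_rows)["VAL"])  # the OBJECTID 2 record wins: early
```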
04-08-2024
03:01 PM
IDEA
The "Join one to first" Join Operation has been developed for Add Join in ArcGIS Pro 3.3, available in Beta now.
04-08-2024
02:59 PM
IDEA
This functionality is delivered in ArcGIS Pro 3.2. You can use a choice list of domain values in the Insert Values portion of the field calculator's expression builder control.
04-04-2024
10:18 AM
IDEA
Hi, the metadata editor in Pro 3.3 includes an interactive table that lists the geoprocessing history/lineage stored in the dataset's metadata, which you can delete or export. You can choose the interactive experience, or use the arcpy.metadata module for automation. Given these alternatives, there isn't really a place for a geoprocessing tool.
03-25-2024
04:40 PM
IDEA
Automated metadata management in ArcGIS Pro uses the arcpy.metadata module rather than a geoprocessing toolbox/toolset: https://pro.arcgis.com/en/pro-app/latest/arcpy/metadata/what-is-the-metadata-module.htm To remove geoprocessing history/lineage from a dataset's metadata:

md = arcpy.metadata.Metadata(path_to_dataset)
md.deleteContent('GPHISTORY')
md.save()

And to prevent the history from being saved in the metadata when you run tools, use this Geoprocessing Option
03-22-2024
05:52 PM
POST
Hi @VinceE, @DanPatterson It's just a matter of prioritization. It will take many releases to update all tools that take an input field to support the new field types and objects; this did not happen automatically for all geoprocessing tools in the first release of the new fields (3.2). Can you please post an enhancement idea on the Pro Ideas page for Sort to support the new temporal field type(s), so that we can gauge user demand and prioritize accordingly? Thanks, Drew
03-22-2024
09:27 AM
POST
@LMedeirosUI can you please email me the crash .dmp file from the crash you are experiencing? You can find it in this folder: C:\Users\<your user name>\AppData\Local\ESRI\ErrorReports with a timestamp matching when the crash occurred. Email it to dflater@esri.com. It would also be great if you could contact Esri Support to log the issue with your reproducible steps, so that the development team can work to fix the crash. As a workaround, you could try a different tool, like Calculate Field instead of Convert Temporal Field, which offers Python-based functionality for adjusting or converting temporal field values. I won't know which tool is causing the crash until I review the crash .dmp file with our development tools.
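As a rough sketch of that Calculate Field workaround (the field name and timestamp format below are hypothetical, not taken from the original post), a Python code-block function can parse a text timestamp into a date value using the standard datetime module:

```python
from datetime import datetime

def to_date(value):
    """Parse a 'MM/DD/YYYY HH:MM' text value into a datetime object.

    In Calculate Field this could live in the Code Block and be called
    from the expression, e.g. to_date(!SomeTextField!). The field name
    and format string here are hypothetical.
    """
    return datetime.strptime(value, "%m/%d/%Y %H:%M")

print(to_date("03/20/2024 15:43"))  # prints 2024-03-20 15:43:00
```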
03-20-2024
03:43 PM
IDEA
@EDUARDOARDILARINCON this is being resolved as a BUG in Pro 3.3. The calculation is fixed to match the documented two decimal places for seconds and three decimal places for minutes:

DDD° MM' SS.ss"
DDD° MM.mmm'

So no manual rounding will be required starting in 3.3. This request was made by many users who wanted the rounded value, and no one seems to need an option for the full 6+ digits, so rather than give the tool yet another parameter, the seconds and minutes decimals are simply being corrected to match the documentation.
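For reference, the documented rounding can be mimicked in plain Python. This hypothetical helper (not the tool's actual code) converts decimal degrees to degrees/minutes/seconds with seconds rounded to two decimal places, matching the DDD° MM' SS.ss" format:

```python
def dd_to_dms(dd):
    """Convert decimal degrees to (degrees, minutes, seconds), with
    seconds rounded to two decimal places per the documented format.
    Illustrative only, not the tool's implementation."""
    sign = -1 if dd < 0 else 1
    dd = abs(dd)
    degrees = int(dd)
    minutes_full = (dd - degrees) * 60
    minutes = int(minutes_full)
    seconds = round((minutes_full - minutes) * 60, 2)
    return sign * degrees, minutes, seconds

print(dd_to_dms(34.123456))  # (34, 7, 24.44)
```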
03-14-2024
10:52 AM
IDEA
The Calculate Field expression control already grows up to 5 lines to show longer expressions. @Anne-MarieDubois's comment on the other issue is about Calculate Value, which is a ModelBuilder utility tool. I can confirm that the Calculate Value expression control does not exhibit the same growing behavior.
03-11-2024
12:43 PM
POST
@Bud you can change the Metadata Style to anything except the default Item Description in order to see the geoprocessing history that is stored in the dataset metadata. https://pro.arcgis.com/en/pro-app/latest/help/metadata/view-and-edit-metadata.htm#ESRI_SECTION1_0FAB123C7C3C4CD49894272A899490ED
03-08-2024
01:56 PM
POST
Hi @Bud was that dataset derived by running a system geoprocessing tool, or a custom ModelBuilder model or Python script tool? Has that dataset been used in a ModelBuilder model or Python script tool that applies the thick red line and labeling?

I am asking because, for a number of system geoprocessing tools and custom ModelBuilder or Python script tools that define the symbology of the output parameter, the geoprocessing framework stores a layer file representing that symbology and other layer properties in the dataset metadata, so that if the dataset is added to another map, the symbology and layer properties are consistent with when the tool was originally run. This system was put in place early in Pro 2.x. The goal was that the result of an analysis often has specific symbology and layer properties that are necessary for the result to be interpreted and understood; when that info was not stored with the dataset, only the layer produced by running the tool would have the full information necessary for the result to make sense. The geoprocessing framework has been doing this for several releases at least. I am not familiar with the thick red line with labeling, which is why I'm asking about custom model or script tools.

A way to be sure the layer file has been stored in the dataset metadata is to export the metadata to an XML file and look for the LayerFile tag. Right-click the dataset in the Catalog and select View Metadata. In the Catalog ribbon tab, in the Metadata group, click Save As > Save As XML > All Content, and use the browse dialog to select a location and name for the XML file. The XML file will look something like the example below, including the LayerFile tag (I highlighted it in purple; the XML has also been "pretty printed" for easier readability). When this LayerFile is present in the dataset metadata, the mapping system automatically applies the LayerFile properties to the layer whenever the dataset is added to a map.

If you find the LayerFile and do not want to keep this information, you can delete that part of the XML, save the file, then back in the Catalog select the dataset and, in the Catalog ribbon tab, in the Metadata group, click Import and browse for the XML file that you saved with the cleaned-out LayerFile. If you can find which system or custom processes have been run using that dataset (the XML metadata also contains a useful lineage tag), we can then find out why the LayerFile was embedded in the dataset metadata in the first place.
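That manual XML edit could also be scripted. Below is a hedged sketch using only the standard library; the sample document is heavily simplified (a real metadata export contains many more elements), and only the LayerFile tag name comes from the workflow above:

```python
import xml.etree.ElementTree as ET

def strip_layer_file(xml_text):
    """Return the metadata XML with any LayerFile elements removed."""
    root = ET.fromstring(xml_text)
    for parent in root.iter():
        # Iterate over a copy of the children so removal is safe.
        for child in list(parent):
            if child.tag == "LayerFile":
                parent.remove(child)
    return ET.tostring(root, encoding="unicode")

# Heavily simplified stand-in for an exported metadata file.
sample = "<metadata><Esri><LayerFile>...embedded layer...</LayerFile></Esri></metadata>"
print(strip_layer_file(sample))  # the LayerFile element is gone
```

The cleaned text would then be saved back to the XML file and imported via the Catalog Import step described above.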
02-21-2024
11:58 AM
IDEA
These concerns about the tbx format are heard and acknowledged. Because tbx is a binary format, its persistence streams can become corrupt while changes are saved into the toolbox, and these corrupted binary streams are almost always unrecoverable. There are no tools that can be used to diagnose toolbox corruption or unexpected behavior like a toolbox appearing to have empty contents.

For these reasons and more, a new toolbox format, atbx, has been developed, and Pro 3.0+ uses this new format by default. The atbx format is a zip archive containing JSON and other text files that define the toolbox properties and the tools inside the toolbox, and it is much less likely to become corrupt. If you encounter a corrupt atbx, you can extract its contents like a zip file, copy the files and folders inside the extracted directory into a new folder, re-archive that folder as a zip, and rename it to atbx. This may help you recreate a new toolbox from the contents of a broken atbx.

At this time there is no plan for a public utility or process for fixing or retrieving content from corrupt or broken toolboxes, in either the tbx or atbx format. But we are taking steps to make the atbx format and its specification more public and interoperable with other systems, and it is far more recoverable than tbx thanks to the new storage format.
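The unzip-and-repackage recovery described above can be scripted with the standard library. A minimal sketch (file and folder names are placeholders):

```python
import shutil

def reassemble_atbx(extracted_dir, out_name):
    """Re-archive the copied contents of an extracted .atbx directory as
    a zip, then rename the archive to .atbx. Returns the new path.

    To extract the broken toolbox first, something like
    shutil.unpack_archive("Broken.atbx", extracted_dir, format="zip")
    can be used, since atbx is a zip archive of json/text files.
    """
    archive = shutil.make_archive(out_name, "zip", root_dir=extracted_dir)
    new_path = out_name + ".atbx"
    shutil.move(archive, new_path)
    return new_path
```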
02-14-2024
02:37 PM
IDEA
This idea has not received further comments or kudos since it was added a few years ago, so it will not be prioritized in our development plans. The suggestion from @DuncanHornby to use ModelBuilder with an iterator and the Alter Field tool is a really good one. You can build a model that takes the alias table (1) as input, iterates (2) through each row, where each row pairs one field name with the new alias, gets the existing field name (3) and the new alias value (4) from the table row, and passes these into the Alter Field tool (5) to perform the alias update. You can see that my NewFieldAlias values have been used correctly and the layer has been updated with the new aliases (6).
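The per-row logic of that model can be sketched in plain Python. Here the alias table is represented as simple (name, alias) rows; the field names are made up, and the Alter Field call the model performs at step (5) is shown only as a comment:

```python
# Hypothetical alias table rows: (existing field name, new alias).
alias_rows = [
    ("POP2020", "Population (2020)"),
    ("MHI", "Median Household Income"),
]

def build_alias_updates(rows):
    """Map each existing field name to its new alias."""
    return {name: alias for name, alias in rows}

for name, alias in build_alias_updates(alias_rows).items():
    # In the actual model, step (5) runs Alter Field here, e.g.:
    # arcpy.management.AlterField(in_table, name, new_field_alias=alias)
    print(f"{name} -> {alias}")
```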
02-14-2024
02:23 PM
| Title | Kudos | Posted |
|---|---|---|
|  | 3 | 03-22-2024 09:27 AM |
|  | 2 | 03-08-2024 01:56 PM |
|  | 3 | 02-21-2024 11:58 AM |
|  | 1 | 05-09-2023 02:24 PM |
|  | 3 | 02-27-2023 05:23 PM |