IDEA · a week ago

When exporting to Excel, I would like the option of automatically resizing all the columns to fit the existing values, similar to selecting the entire sheet in Excel and auto-fitting the column widths. There would be a maximum column width to keep the columns manageable.
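Until something like this exists, the auto-fit logic is simple enough to approximate in a post-processing script: measure the longest value in each column and cap it. A minimal sketch of the width calculation in plain Python (the 50-character cap and the one-character-per-unit-of-width heuristic are assumptions; applying the widths to a real workbook would require a library such as openpyxl):

```python
# Sketch: approximate Excel's auto-fit by measuring the longest cell
# value per column, with a cap to keep columns manageable.
MAX_WIDTH = 50  # assumed cap, in characters


def autofit_widths(header, rows, max_width=MAX_WIDTH):
    """Return one width per column: the longest value (or header), capped."""
    widths = [len(str(h)) for h in header]
    for row in rows:
        for i, value in enumerate(row):
            widths[i] = max(widths[i], len(str(value)))
    return [min(w, max_width) for w in widths]


header = ["OBJECTID", "NAME"]
rows = [(1, "Short"), (2, "A very long name " * 5)]
print(autofit_widths(header, rows))  # [8, 50] -- second column capped
```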
IDEA · a week ago

When using Table To Excel, I would like the option of exporting the data as a true table in the Excel sheet, the equivalent of pressing Ctrl+T in Excel.
IDEA · a week ago

When using Table To Excel, I would like the option of defining the data type/formatting of the fields. For example, lengthy ID numbers sometimes get formatted as scientific notation in Excel. I want to format those numbers as integers (numbers with zero decimal places) or as text, and I want to define the formatting from within the GP tool or model, not after the fact in Excel.
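For context on why this matters: once a long ID is treated as a generic number, it passes through floating point, which cannot represent every large integer exactly. A quick illustration in plain Python (the sample ID is made up):

```python
# A 17-digit ID, as it might appear in an ID field.
id_text = "12345678901234567"

# Round-tripping through a float (what happens when the value is
# treated as a generic number) silently changes the last digits,
# because IEEE doubles cannot represent every integer above 2**53.
as_float = float(id_text)
print(int(as_float))  # no longer equals the original ID
print(int(id_text))   # exact, because it never became a float

print(int(as_float) == int(id_text))  # False
```

This is why exporting such fields as text, or as true integers, avoids corrupted IDs.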
IDEA · a week ago

When using the Table To Excel geoprocessing tool, the Output Excel File parameter is used as both the file name and the sheet name. I would like the option of defining a sheet name that is different from the file name.
IDEA · a week ago

In the Catalog pane > Export context menu, add an option to Export To Excel. Exporting to Excel is a common workflow, and a Catalog option would be more convenient than searching for the Table To Excel tool.
IDEA · a week ago

ArcGIS Pro 2.9.5: I have several feature classes whose attribute tables I want to export to a single Excel file, as separate sheets within the same workbook. The Table To Excel geoprocessing tool creates a new Excel file for each table it exports. Can Pro be enhanced to allow exporting a table to an existing Excel file?

Copying from the attribute table and pasting into a new sheet in Excel works for some use cases. However, it is very slow for large tables and seems less robust than a formal GP tool. I'm not sure how copying works behind the scenes in Windows 10, but it might not be ideal to hold that much data in the clipboard in RAM.
POST · a week ago

Sometimes I use MIN(OBJECTID). Edit: CAST(MIN(OBJECTID) AS INT). But I prefer to use Oracle's ROWNUM pseudocolumn; I imagine SQL Server has an equivalent mechanism. I normally need to write CAST(ROWNUM AS INT) AS ROWNUM_ so that Pro will recognize the number as an integer instead of a double.
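As an aside, the same pattern can be prototyped with any database that supports the standard ROW_NUMBER() window function (SQL Server does; so does SQLite). A sketch using Python's built-in sqlite3 module, with invented table and column names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE parcels (objectid INTEGER, name TEXT);
    INSERT INTO parcels VALUES (10, 'A'), (20, 'B'), (30, 'C');
""")

# Generate a sequential integer per row, analogous to Oracle's ROWNUM,
# and CAST it so it comes back as an integer rather than a double.
rows = con.execute("""
    SELECT CAST(ROW_NUMBER() OVER (ORDER BY objectid) AS INT) AS rownum_,
           name
    FROM parcels
""").fetchall()
print(rows)  # [(1, 'A'), (2, 'B'), (3, 'C')]
```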
IDEA · a week ago

"...I currently am having a major struggle getting views to work in a file geodatabase at all..."

Yeah, I've had a lot of issues with FGDB views too. I often forget that it's possible to register an FGDB view with the geodatabase; they're not registered by default. But I don't know whether that solves many problems; I haven't tested it enough.

I talk about some FGDB database view challenges in these posts:
- Selecting the most recent records based on unique values in another field
- Select maximum values in Select By Attributes (greatest n per group)

I remember finding it very helpful to put the data into a mobile geodatabase and use a SQL client like DBeaver to write my SQL queries. Then, once I'd figured out the SQL, I'd export the data to an FGDB and create an FGDB database view using the SQL I'd written for the mobile geodatabase. That's clunky, but it's better than writing FGDB SQL in the Create Database View geoprocessing tool's SQL parameter.
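Since a mobile geodatabase is a SQLite database under the hood, the view SQL can also be prototyped directly from Python's built-in sqlite3 module before porting it to an FGDB view. A greatest-1-per-group sketch (most recent record per asset), with invented table and field names:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE inspections (asset_id TEXT, insp_date TEXT, status TEXT);
    INSERT INTO inspections VALUES
        ('A', '2023-01-01', 'OK'),
        ('A', '2023-06-01', 'FAIL'),
        ('B', '2023-03-01', 'OK');
""")

# Most recent inspection per asset: join each row to its group's max date.
sql = """
    SELECT i.asset_id, i.insp_date, i.status
    FROM inspections i
    JOIN (SELECT asset_id, MAX(insp_date) AS max_date
          FROM inspections GROUP BY asset_id) m
      ON i.asset_id = m.asset_id AND i.insp_date = m.max_date
    ORDER BY i.asset_id
"""
latest = con.execute(sql).fetchall()
print(latest)  # [('A', '2023-06-01', 'FAIL'), ('B', '2023-03-01', 'OK')]
```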
IDEA · 2 weeks ago

As a workaround, we can try hiding/embedding the SQL expression using Make Query Table. The SQL expression behaves like a pre-filter, applied before the join is performed, which is what we want, unlike definition queries. https://community.esri.com/t5/arcgis-pro-ideas/consistent-visible-sql-expressions-and-query/idc-p/13...

But we have to be careful not to forget about that SQL expression, because it isn't visible or modifiable anywhere. I find it helps to describe the SQL expression in the layer's name.
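The pre-filter versus post-filter difference is easy to see with a LEFT JOIN: filtering the join table before the join keeps unmatched parent rows, while filtering after the join discards them. A sketch with Python's built-in sqlite3 module (all names invented; this illustrates the SQL semantics, not Make Query Table itself):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE parcels (pid INTEGER);
    CREATE TABLE owners  (pid INTEGER, owner TEXT, active INTEGER);
    INSERT INTO parcels VALUES (1), (2);
    INSERT INTO owners  VALUES (1, 'Ann', 1), (2, 'Bob', 0);
""")

# Pre-filter (like Make Query Table): restrict owners first, then join.
pre = con.execute("""
    SELECT p.pid, o.owner
    FROM parcels p
    LEFT JOIN (SELECT * FROM owners WHERE active = 1) o ON p.pid = o.pid
    ORDER BY p.pid
""").fetchall()

# Post-filter (like a definition query applied to the joined result).
post = con.execute("""
    SELECT p.pid, o.owner
    FROM parcels p
    LEFT JOIN owners o ON p.pid = o.pid
    WHERE o.active = 1
    ORDER BY p.pid
""").fetchall()

print(pre)   # [(1, 'Ann'), (2, None)] -- parcel 2 kept
print(post)  # [(1, 'Ann')]            -- parcel 2 dropped
```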
IDEA · 2 weeks ago

If the join table has a definition query, the Keep all input records parameter will have no effect. Manually updating the definition query by appending OR OBJECTID IS NULL can fix this, if appropriate. https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/add-join.htm
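The appended clause works because the definition query is evaluated after the outer join, and unmatched rows carry NULL in every join-table field, including its Object ID. A before/after sketch with Python's built-in sqlite3 module (all names invented):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE targets (tid INTEGER);
    CREATE TABLE joins   (objectid INTEGER, tid INTEGER, flag TEXT);
    INSERT INTO targets VALUES (1), (2);
    INSERT INTO joins   VALUES (100, 1, 'X');
""")

# The definition query alone drops the unmatched target row...
strict = con.execute("""
    SELECT t.tid, j.flag FROM targets t
    LEFT JOIN joins j ON t.tid = j.tid
    WHERE j.flag = 'X'
    ORDER BY t.tid
""").fetchall()

# ...appending OR objectid IS NULL keeps it ("keep all input records").
keep_all = con.execute("""
    SELECT t.tid, j.flag FROM targets t
    LEFT JOIN joins j ON t.tid = j.tid
    WHERE j.flag = 'X' OR j.objectid IS NULL
    ORDER BY t.tid
""").fetchall()

print(strict)    # [(1, 'X')]
print(keep_all)  # [(1, 'X'), (2, None)]
```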
IDEA · 2 weeks ago

The Join Operation parameter has three states to adjust the cardinality:
- The default (blank) allows the data source to attempt a one-to-many join.
- Join one to many works only on specific data sources that have an Object ID field.
- Join one to first uses the first match in the table, which may produce different outputs if the Object ID field is changed or the table is copied to a different workspace.

One-to-first joins are not case sensitive; one-to-many joins are case sensitive. https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/add-join.htm
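The cardinality difference, and why one-to-first results can change when row order changes, can be sketched in plain Python (all data invented; this mimics the described behavior rather than the tool's actual implementation):

```python
targets = [("P1",), ("P2",)]
join_rows = [("P1", "a"), ("P1", "b"), ("P2", "c")]


def join_one_to_many(targets, join_rows):
    """Emit one output row per match: targets can be duplicated."""
    return [(t, v) for (t,) in targets for (k, v) in join_rows if k == t]


def join_one_to_first(targets, join_rows):
    """Emit exactly one output row per target: whichever match comes first."""
    out = []
    for (t,) in targets:
        match = next((v for (k, v) in join_rows if k == t), None)
        out.append((t, match))
    return out


print(join_one_to_many(targets, join_rows))
# [('P1', 'a'), ('P1', 'b'), ('P2', 'c')]
print(join_one_to_first(targets, join_rows))
# [('P1', 'a'), ('P2', 'c')]
print(join_one_to_first(targets, list(reversed(join_rows))))
# [('P1', 'b'), ('P2', 'c')] -- row order changed the one-to-first result
```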
IDEA · 2 weeks ago

Select By Attributes is not supported for 1:M joins when there are duplicate OBJECTIDs in the attribute table; the selection doesn't work as expected, and incorrect rows are selected. From "Select By Attributes on joined data": for rows that are 1:M, all rows in the join table get selected, despite the selection criteria.

That limitation is currently unfixable. Instead of confusing users with wrong selection results, I'm wondering if it would be safer to disable Select By Attributes in this specific scenario, namely when the Duplicates warning is present in the attribute table.
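A plain-Python sketch of why the selection goes wrong, assuming (as the observed behavior suggests) that a selection is ultimately stored as a set of OBJECTIDs, which cannot distinguish joined duplicates (data invented):

```python
# Joined rows: (objectid, joined_value). OBJECTID 1 appears twice (1:M).
joined = [(1, "a"), (1, "b"), (2, "c")]

# The user selects rows WHERE joined_value = 'a' ...
wanted = [(oid, v) for oid, v in joined if v == "a"]

# ... but if the selection is stored as a set of OBJECTIDs,
# every row sharing OBJECTID 1 ends up selected.
selected_oids = {oid for oid, _ in wanted}
effective = [(oid, v) for oid, v in joined if oid in selected_oids]

print(wanted)     # [(1, 'a')]
print(effective)  # [(1, 'a'), (1, 'b')] -- (1, 'b') selected despite criteria
```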