IDEA
Want to bump this up as recently I have found the need to reclassify a range of pixel values into floats and was surprised to discover the Reclassify tool only supports integer output. I have a workaround, but it would be good if Reclassify and Reclassify by Table accepted floats as an output format.
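As a sketch of the kind of workaround I mean: pull the raster into NumPy, map classes to float values with a lookup table, then write it back out. The array values, class breaks and float outputs below are all hypothetical stand-ins (in ArcGIS Pro you would round-trip via RasterToNumPyArray / NumPyArrayToRaster).

```python
import numpy as np

# Hypothetical stand-in for arcpy.RasterToNumPyArray output.
pixels = np.array([[10, 75, 160], [220, 45, 300]])

# Hypothetical class breaks and the float value each class should map to.
breaks = [50, 150, 250]                       # upper edges of first three classes
float_values = np.array([0.25, 0.5, 0.75, 1.0])

# np.digitize returns a bin index per pixel; index into the float lookup
# table to produce the float32 output the Reclassify tool won't give us.
reclassified = float_values[np.digitize(pixels, breaks)].astype(np.float32)
print(reclassified.tolist())
```

The same array would then go back out via NumPyArrayToRaster with the source raster's extent and cell size.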
3 weeks ago
IDEA
I think an alternative approach to this idea that achieves the same end result is to continue allowing tools to export with metadata, but to add a tool (and a UI button) that clears the metadata back to a default "nothing", as if you had just created the dataset for the first time.
3 weeks ago
IDEA
On the back of this Q&A I wish to suggest the following idea: allow users to choose a larger number of bins in a histogram rather than being constrained to 64 bins. I understand performance may degrade, but sometimes it's helpful to push the number up. If a larger number is selected, it would not be unreasonable for a warning to pop up telling you that ArcGIS Pro may become slower or unresponsive.
3 weeks ago
POST
I have a large float raster with 903,908,029 cells, and using Create Chart from the Contents pane for the raster layer I can create a histogram. There is a slider to change the number of bins, but I cannot move it beyond 34. By that I mean if I drag it to the end it says 64, then after it has recomputed the histogram it jumps back to 34. Why can't I change it to, say, 100 bins?

The help file states: "The number of bins defaults to the square root of the number of records in your dataset. You can adjust this by changing the Bins value on the Data tab of the Chart Properties pane. Changing the number of bins allows you to see more or less detail in the structure of your data."

This statement seems to imply I can change it to whatever I like, but clearly it's being constrained in some way. Can someone from Esri explain why this is happening, as it appears not to be documented in the help file (or I missed it)?
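For what it's worth, if the square-root rule the help file quotes were applied literally to this raster it would suggest a bin count far above the slider's 64-bin cap (this is just the stated rule worked through, not Esri's actual implementation):

```python
import math

# The help file's rule: default bin count is the square root of the
# number of records. Applying it literally to this raster's cell count.
n_cells = 903_908_029
default_bins = round(math.sqrt(n_cells))
print(default_bins)   # prints 30065, far above the 64-bin slider cap
```

So whatever caps the slider at 34 or 64 must be a separate, undocumented constraint.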
3 weeks ago
IDEA
Anyone from the geoprocessing/Python Esri team have a solution to this? If you have a toolbox (.atbx) with dozens of embedded scripts, how do you encrypt all the scripts en masse? I'm regularly hitting this problem and am in need of a solution.
03-14-2024 05:16 AM
IDEA
Whilst Esri ruminate on this idea, are you aware of WhiteboxTools? They have a tool that does exactly what you want, RasterToVectorPolygons, which will take a float raster. It's free and has been exposed as a geoprocessing toolbox which you can download here. You will be forced to accept its limitations; for example, it takes a TIF as input and outputs a shapefile, which is known to have size limitations. I tested it on a small section of a DEM which is float32. I tried it on a larger section of DEM and it seemed to lock up, but that was probably me being impatient. You might find that if you have to process large rasters, your existing workflow, whilst irritating, may be the fastest approach?
03-14-2024 03:09 AM
POST
Seems alright to me. Remember a multi-value parameter comes back as a semicolon-separated string, so the first thing I tend to do is split it into a list. You can add the following line to your execute function to get an idea of what is returned: arcpy.AddMessage(account_numbers.split(";"))
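A small sketch of the splitting step outside arcpy, using a made-up parameter string (`raw` below is a hypothetical stand-in for the tool parameter's valueAsText): values containing spaces typically come back wrapped in single quotes, so it's worth stripping those as well.

```python
# Hypothetical multi-value parameter string as arcpy hands it over:
# items separated by semicolons, space-containing items single-quoted.
raw = "12345;'67 890';99999"

# Split on the semicolons, then strip any wrapping quotes per item.
account_list = [v.strip("'") for v in raw.split(";")]
print(account_list)   # prints ['12345', '67 890', '99999']
```

From there you can iterate the clean list in your execute function instead of working with the raw string.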
03-13-2024 09:50 AM
IDEA
Your idea is unlikely to get traction. A float raster is typically some sort of surface where every pixel has a different value. Yes, you might get areas where there are clusters of pixels with the same value, but I would imagine that is the exception rather than the rule. Taking an elevation raster I have at hand, it is 6558 columns by 12150 rows. That could very easily end up outputting a polygon dataset with 79,679,700 polygons! Start messing around with lidar-derived data and you could end up with billions of polygons.
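The worst-case figure above is just the cell count of that raster, i.e. every cell becoming its own polygon:

```python
# Worst case for a float raster where every cell has a unique value:
# each cell converts to its own polygon.
cols, rows = 6558, 12150
print(cols * rows)   # prints 79679700
```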
03-13-2024 09:35 AM
IDEA
When you select a feature layer in the Contents pane, the ribbon shows a Data tab from which you can access a variety of tools; a button I use almost hourly is the Attribute Table button, which simply opens the attribute table. Yes, I know I could press CTRL+T or right-click on the layer, but the ribbon is what I tend to use. Now, an integer raster typically has an attribute table which I will want to view and interact with; for example, select a row and that selection feeds into the Zonal Statistics tool. The ArcGIS Pro user experience is not the same when you access a raster layer: there is no button to open the attribute table associated with the raster. I would like to see an Open Attribute Table button in the ribbon. Happy for it to be greyed out when the raster is a float; that makes sense. Now, I know I can customise the ribbon and add such a button, but for the sake of a consistent user experience such a button should be part of the default interface. No one should need to be messing around with creating custom groups for a button that gets used by everyone every day!
03-13-2024 09:12 AM
IDEA
@ReidHarwood11397 The Export Table tool has had its field mapping interface much improved in ArcGIS Pro 3.2. They have removed the annoying behaviour of forcing you to have an OID_ field in your CSV file, and have given you much more control over which fields are output and in what order. In the example below I have included the ObjectID field to show that you can indeed "put it back", as well as move it to another position; or, in my use case, get rid of it completely. Nice! There is a really good blog you should read: ArcGIS Pro 3.2 Field map enhancements and design updates (esri.com)
02-28-2024 02:41 AM
POST
I'm surprised you did not observe a reduction in processing time by moving the data into a file geodatabase. Maybe the shapefiles had a spatial index already built? You can tell by simply opening a layer's attribute table: if you can see a * next to the Shape field header, then you have a spatial index. Having seen your screenshot, and knowing you are talking about millions of polygons, which would suggest your dataset covers a nation, another thing that can cripple performance is having large MULTIPART features that cover the majority of the extent of your data. If your data has those, another simple performance-boosting step is to convert the data into SINGLEPART, then run your selection tool on those datasets. Just an idea; it won't make your PC use all its cores, but it makes the query significantly more efficient.
02-28-2024 02:22 AM
POST
You can use this tool to extract the dangling nodes and then use that as the selecting layer in the Select Layer By Location tool.
02-27-2024 04:22 AM
|
POST
Can you explain your data a little more? You seem to be saying that both your farmland and steep-terrain polygons are "100m mesh polygons". What do you mean by that? A picture would be helpful. You also say your data is stored as shapefiles; these are an old format, and spatial indexing is something you need to add to the data. As you are talking millions of polygons, I would move the data into a file geodatabase; you will get an instant boost in performance. Your PC hardware is pretty high spec, so I don't think you can improve upon that.
02-27-2024 04:09 AM