I have created numerous line density maps of geologic faults with the ArcGIS Pro Line Density tool. I specify a cell size and a search radius, e.g., 5 and 25, or 25 and 50, respectively, and always set the output area units to 'square kilometers'. The resulting raster never has the cell size (5 or 25) that I specified; it is always considerably larger. The fault layer is in the WGS 1984 geographic coordinate system.
Also, shouldn't the resulting output value (km/sq km) always be the same regardless of the cell size or search radius? I am getting different results for each run with different parameters.
Thank you
Laurie
Laurie ... consider km per sq km to be a convenience expression. If your x, y, and z values are in meters, then you would expect the reported number to be the m/m^2 density rescaled by the m^2-to-km^2 conversion, et cetera.
When you start mixing geographic coordinates and/or imperial/US units, I prefer to do the proper prep step and convert everything myself rather than rely on assumed conversions (unless you test them yourself).
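To make the "convenience expression" point concrete, here is a minimal sketch of the rescaling involved. The numbers are hypothetical, chosen only to illustrate the arithmetic; this is not the tool's internal code.

```python
# Sketch: "km per sq km" is just "m per sq m" rescaled.
# 1 m of line = 0.001 km, and 1 m^2 of area = 1e-6 km^2,
# so the density value scales by 0.001 / 1e-6 = 1000.

def m_per_sqm_to_km_per_sqkm(density_m_per_sqm: float) -> float:
    """Convert a line density from m/m^2 to km/km^2."""
    return density_m_per_sqm * 1000.0

# e.g. 0.002 m of fault line per square meter (hypothetical value)
print(m_per_sqm_to_km_per_sqkm(0.002))  # 2.0 km per sq km
```

The point is that the unit label only makes sense when the underlying coordinates really are in meters; with geographic (degree) coordinates, no fixed scale factor applies.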
A lot can go wrong if you don't use projected data in the first place, or forget to specify a projected output coordinate system instead. I would also have a look at the cell size projection method mentioned in the help topic, since it will have an impact on the resulting raster.
Line Density (Spatial Analyst)—ArcGIS Pro | Documentation
My suggestion ... project the fault layer first, ensuring that the analysis runs with cell sizes in meters (not degrees), then run the Line Density tool.