Hello,
I'm currently using ArcGIS Pro 3.1.2 for a research project, and I'm at the point where I need to use the Hillshade tool.
I started with a polygon layer representing a set of building footprints. I added heights (determined through a separate process) to each building via a table join and converted the polygon layer to a raster, so I now have a raster in which the building cells hold each building's height (see attachment 1).
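For reference, that step is roughly equivalent to the arcpy sketch below (the field, table, and dataset names are placeholders, not my actual names):

```python
import arcpy

arcpy.env.workspace = r"C:\project\study_area.gdb"  # placeholder workspace

# Join the externally derived heights onto the building footprints
arcpy.management.JoinField(
    "Buildings",        # polygon feature class of building footprints
    "BuildingID",       # key field in the footprints
    "BuildingHeights",  # standalone table holding the computed heights
    "BuildingID",       # key field in the table
    ["Height_m"]        # height attribute to carry over
)

# Convert the footprints to a raster whose cell values are building heights
arcpy.conversion.PolygonToRaster(
    "Buildings", "Height_m", "building_height_ras",
    "CELL_CENTER", "NONE", 1  # 1 m cells, just for illustration
)
```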
When I feed my parameters (see attachment 2) into the Hillshade tool, the resulting hillshade raster looks very strange. I would expect a result like attachment 3, but I get something like attachment 4 instead. Is there something wrong with the data or with my workflow for arriving at a result like attachment 3?
I also changed the symbology of attachment 4 to exaggerate the differences in the result, and the output still looks nonsensical to me: these are not the shadow patterns I would expect from the Hillshade tool, since the variation is confined to a single subset of the study area and the shapes of the shadows don't match the surrounding building obstructions.
Attachment 1 - Starting Building Raster Layer
Attachment 2 - Hillshade Tool Inputs
Attachment 3 - Expected Result
Attachment 4 - Actual Result (almost all pixels have a value of 0)
Attachment 4 With Changed Symbology:
Entire dataset zoomed out
Subset of dataset zoomed in
Here is also a link to a supporting research paper for further context: https://www.sciencedirect.com/science/article/pii/S0306261916309424?via%3Dihub
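For completeness, the Hillshade step itself is essentially the call below (the exact azimuth, altitude, and z-factor values I used are in attachment 2; the values shown are just the tool defaults plus shadow modelling):

```python
import arcpy
from arcpy.sa import Hillshade

arcpy.CheckOutExtension("Spatial")

# Hillshade on the building-height raster; 315/45/1 are the tool defaults,
# and "SHADOWS" models cast shadows (my actual inputs are in attachment 2)
hs = Hillshade("building_height_ras", 315, 45, "SHADOWS", 1)
hs.save("building_hillshade")
```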
Hi @lucasc39,
I think you are on the right track with your observation that "almost all pixels have a value of 0".
When we run the Polygon to Raster tool, it only assigns values within the building footprints; every cell outside a footprint is assigned NoData. The document below goes into further detail on why NoData values can be problematic for analysis.
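One quick way to test this, assuming the cells outside the footprints should be treated as ground level (elevation 0) and that your study-area boundary exists as a feature class (names below are placeholders), is to fill the NoData cells with 0 before running Hillshade. A minimal sketch:

```python
import arcpy
from arcpy.sa import Con, IsNull, Hillshade

arcpy.CheckOutExtension("Spatial")

# Optionally expand the processing extent to the whole study area, since
# Polygon to Raster only covers the extent of the building footprints
# ("StudyAreaBoundary" is a hypothetical boundary feature class).
arcpy.env.extent = "StudyAreaBoundary"

heights = arcpy.Raster("building_height_ras")

# Replace NoData (everything outside the footprints) with 0 so the surface
# is continuous, then hillshade the filled surface.
filled = Con(IsNull(heights), 0, heights)
hillshade = Hillshade(filled, 315, 45, "SHADOWS", 1)
hillshade.save("building_hillshade_filled")
```

Whether 0 is the right fill value depends on your elevation model; if you are combining building heights with terrain, you would fill with the ground elevation instead.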