Hello, I searched for this, but I just can't find a solution.
Here's the problem: I have roads as a vector line file (geodatabase). I have a hazard layer as a raster file (values from 0 to 10). I would like to assign a hazard rating (1 to 10) to each road. However, the hazard changes dramatically from one place along a line to another, so I know I need to segment the lines; I'll probably do that at 200-foot intervals. With that in mind, I created points along the roads every 100 feet.
My idea was to extract the values from the raster to the 100-ft interval points, then do a spatial join from those points to the 200-ft segmented roads.
However, where the roads are, the hazard layer shows no hazard (0), because the roads themselves are in the raster too.
Is there a way to extract the average of all the cells within a 200-foot radius centered on each of the points? And can I somehow ignore the 0 values?
I suspect I need to do this in Python, but I'm hoping there's a tool I'm not aware of.
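For reference, the part I already know how to script looks roughly like this in arcpy (a sketch only; every dataset name below is a placeholder for my real data):

```python
# Sketch of the plan described above (arcpy; untested, and the dataset
# names are placeholders for my actual layers).
import arcpy

arcpy.env.workspace = r"C:\data\roads.gdb"  # hypothetical geodatabase
arcpy.CheckOutExtension("Spatial")

# Points every 100 ft along the roads (this part is already done)
arcpy.management.GeneratePointsAlongLines(
    "roads", "road_pts_100ft", "DISTANCE", "100 Feet"
)

# Pull the hazard value under each point (adds a RASTERVALU field)
arcpy.sa.ExtractValuesToPoints("road_pts_100ft", "hazard_raster", "road_pts_vals")

# Join the points to the 200-ft segments; to get one rating per segment,
# the field map would need a Mean merge rule on RASTERVALU.
arcpy.analysis.SpatialJoin(
    "roads_200ft_segments", "road_pts_vals", "roads_hazard_rating",
    "JOIN_ONE_TO_ONE", "KEEP_ALL", match_option="INTERSECT"
)
```

It all runs, but the extract step is where the 0s come in, since the points sit right on the road cells.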
Thanks in advance...
Okay - here's what I did, and it seemed to work (much quicker than I had initially thought it would - bonus!):
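In short: set the 0 (road) cells to NoData with SetNull, run Focal Statistics with a 200-foot circular neighborhood and the MEAN statistic set to ignore NoData, then run Extract Values to Points on the 100-ft points. In Python it comes out to something like this (a sketch; my real dataset names differ):

```python
# Null out the 0 (road) cells, take a 200-ft circular mean that ignores
# NoData, then sample the smoothed raster at the points.
# Dataset names are simplified stand-ins for my actual data.
import arcpy
from arcpy.sa import ExtractValuesToPoints, FocalStatistics, NbrCircle, SetNull

arcpy.CheckOutExtension("Spatial")
arcpy.env.workspace = r"C:\data\roads.gdb"

# 1) Treat the 0 cells (the roads themselves) as NoData
hazard_no_zero = SetNull("hazard_raster", "hazard_raster", "VALUE = 0")

# 2) Mean of all cells within 200 map units (feet, assuming a foot-based
#    coordinate system); "DATA" drops NoData cells from the average
#    instead of nulling the result
hazard_mean = FocalStatistics(
    hazard_no_zero, NbrCircle(200, "MAP"), "MEAN", "DATA"
)
hazard_mean.save("hazard_mean_200ft")

# 3) Attach the averaged value to each 100-ft point (RASTERVALU field)
ExtractValuesToPoints("road_pts_100ft", hazard_mean, "road_pts_hazard")
```

With the points now carrying non-zero averages, the spatial join to the 200-ft segments worked as planned.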
That did the trick! Thank you all!