I am testing various methods for extracting raster values to points as a demonstration of surface analysis workflows for my students. This particular example extracts slope values (in degrees).
Here are the methods I'm using:
Traditional/local data: Use GP tools to derive a slope raster from a local elevation dataset, then extract the values to points (works as expected).
Local data/raster functions: Use the slope raster function on the local DEM, extract values to points (works as expected).
Terrain (Slope in Degrees) image service: Extract slope values from the image service. The values are integer, which is fine, and the results are comparable to the traditional methods (works as expected).
Terrain image service with slope raster function: I apply the slope raster function to the Terrain image service, then extract the values to points. This returns slope values that are much greater than expected (e.g., an expected value of 4 deg but a returned value of 23 deg), and I can't figure out why.
I suspect this may be related to the display scale of the Terrain image service, but I haven't been able to confirm that in order to fix the workflow on my end. My biggest concern is that my students (or I) will end up with erroneous data by using this method incorrectly in the future. I need to better understand what's going on under the hood to avoid potential pitfalls.
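To help reason about what could produce inflated values, here is a minimal, pure-Python sketch of a Horn-style slope calculation (the general finite-difference approach; I'm not claiming this is Esri's exact implementation). It shows how slope depends directly on the horizontal cell size: if the function evaluates the source DEM at a cell size or in horizontal units different from what's assumed (which is my guess about a display-scale or resampling effect, not a confirmed cause), the computed slope changes dramatically even though the elevations themselves are correct.

```python
import math

def slope_degrees(z, cellsize, z_factor=1.0):
    """Horn's method slope (degrees) for the center cell of a 3x3 window.

    z        -- 3x3 nested list of elevations
    cellsize -- horizontal cell size; z_factor converts vertical units
                into the same units as cellsize
    """
    # Weighted differences across the window (Horn 1981 kernel)
    dzdx = ((z[0][2] + 2 * z[1][2] + z[2][2]) -
            (z[0][0] + 2 * z[1][0] + z[2][0])) / (8 * cellsize)
    dzdy = ((z[2][0] + 2 * z[2][1] + z[2][2]) -
            (z[0][0] + 2 * z[0][1] + z[0][2])) / (8 * cellsize)
    return math.degrees(math.atan(z_factor * math.hypot(dzdx, dzdy)))

# A plane rising 1 m per 10 m cell in x: true slope = atan(0.1) ~= 5.71 deg
plane = [[0, 1, 2],
         [0, 1, 2],
         [0, 1, 2]]
print(slope_degrees(plane, cellsize=10))  # ~5.71

# Same elevations, but the horizontal spacing is misread as 2 m per cell:
# the identical surface now reports atan(0.5) ~= 26.57 deg
print(slope_degrees(plane, cellsize=2))   # ~26.57
```

The second call illustrates the kind of jump I'm seeing (4 deg expected vs. 23 deg returned): the underlying elevation values are unchanged, only the assumed run in rise-over-run differs, which would also be consistent with my Difference result of zero on the source data.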
As a side note, I ran the 'Difference' process tool/function on the correct slope dataset and the incorrect one, and the result was zero, which suggests to me that the correct values are in there somewhere.
Using ArcGIS Pro 2.3.1