I have a bit of a sticky problem and was hoping someone might be able to provide some guidance towards a solution.
I have two DEM rasters stored in a GDB. One is from 2018 with approximately 0.762 m × 0.762 m cell size; we'll call it Raster A (fig. 1). The other is from 2009 with approximately 10 m × 10 m cell size; we'll call it Raster B (fig. 1). Raster A is the topography of a barrier island, while Raster B is the bathymetry surrounding it. My issue is that between the two dates there has been significant erosion of the beach profile, so there is a fairly steep discontinuity between my high-resolution, current data and my older, low-resolution data (fig. 2, fig. 3).
Figure 3: Ideally the slope would look something like the red line drawn.
I need to find a method that appropriately smooths the transition between Raster A and Raster B so that the discontinuity is no longer present along the entire coastline (approx. 22 kilometers). Some areas of the coastline exhibit a wider discontinuity than others.
At the moment my only solution is to delete a slice of Raster B around Raster A and then interpolate between the two rasters. I will likely have to make this slice wide enough to encompass all the discontinuous areas. In Figure 4 you can see the first attempt at creating a buffer zone covers the lower section well enough, but not the upper section.
Figure 4: The boundary to the left is the end of the high-res DEM (Raster A). Everything right of that line is Raster B.
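For what it's worth, here is a minimal sketch of the "buffer and interpolate" idea described above, done as a distance-based feather rather than a hard cut. It assumes both rasters have been exported to aligned NumPy arrays on a common grid (e.g. via arcpy's `RasterToNumPyArray` after resampling Raster B to Raster A's cell size), with NoData in Raster A stored as NaN. The function name `feather_blend` and the `buffer_cells` parameter are just illustrative names, and a fixed-width buffer may still need to vary along the coast where the discontinuity is wider:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def feather_blend(dem_a, dem_b, buffer_cells):
    """Blend high-res dem_a (NaN outside its extent) into dem_b
    across a transition zone buffer_cells wide."""
    valid_a = ~np.isnan(dem_a)
    # Distance (in cells) from each cell to the nearest valid dem_a cell,
    # plus the indices of that nearest cell so we can extend A outward.
    dist, (iy, ix) = distance_transform_edt(~valid_a, return_indices=True)
    a_filled = dem_a[iy, ix]  # dem_a extended outward by its nearest edge value
    # Weight: 1 inside Raster A, tapering linearly to 0 across the buffer.
    w = np.clip(1.0 - dist / buffer_cells, 0.0, 1.0)
    return w * a_filled + (1.0 - w) * dem_b
```

Inside Raster A the output equals Raster A exactly; beyond the buffer it equals Raster B; in between it ramps linearly from A's edge elevation down to B, which is roughly the red line in fig. 3. A linear ramp from the nearest edge value is the simplest choice here; the deleted-slice approach with a proper interpolation (e.g. Topo to Raster over the gap) would honor the surrounding bathymetry better.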
I appreciate any and all suggestions!