I received LAS points from a recent LIDAR survey and built a terrain in a 10.3.1 enterprise geodatabase (SDE), loading the LAS files as multipoints with the survey boundary as a clip polygon. When I ran Terrain to Raster, I got a pattern of short straight lines of lost data around the edges (Dropbox - Screenshot 2018-01-05 12.39.28.png).

When I instead created a LAS dataset (LASD) and used LAS Dataset to Raster, I got a similar pattern of data loss (Dropbox - Screenshot 2018-01-16 17.09.54.png). The areas of data loss do not line up with the LAS file boundaries.

One workaround was to specify a cell size >= 3 when creating a raster from the terrain or the LASD, but that produces rasters with a disappointing level of detail (Dropbox - hillshade.jpg). Another workaround was to create a LASD from just 5-6 LAS files and run LAS Dataset to Raster to SDE storage; that gave me a crisp raster with cell size 1 (Dropbox - hillshade.jpg).
The other workaround I discovered was to run the full intended set of LAS files through LAS Dataset to Raster or Terrain to Raster into file geodatabase storage: the whole tile processed at cell size 1 without data loss, and I could then import the finished raster into SDE storage, again without data loss. This was a bit disappointing, because I had hoped that processing and storing these rasters directly in SDE would work better.
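For reference, here is a minimal arcpy sketch of that two-step workaround (interpolate into a file geodatabase, then copy the finished raster into SDE). All paths, dataset names, and the interpolation settings are hypothetical placeholders, not taken from my actual setup:

```python
# Sketch of the file-geodatabase workaround: LAS Dataset to Raster into a
# file geodatabase first, then CopyRaster into SDE storage.
# Assumes arcpy (ArcGIS Desktop 10.3.1) is available; guarded so the
# sketch can be read without an ArcGIS install.
try:
    import arcpy
except ImportError:
    arcpy = None

# Hypothetical inputs/outputs -- substitute your own paths.
LASD = r"C:\lidar\survey.lasd"                      # LAS dataset
FGDB_RASTER = r"C:\work\scratch.gdb\dem_1m"          # intermediate raster
SDE_RASTER = r"C:\connections\prod.sde\DBO.dem_1m"   # final SDE raster

def build_dem(cell_size=1):
    """Interpolate the LAS dataset to a file-geodatabase raster at the
    desired cell size, then import the completed raster into SDE."""
    if arcpy is None:
        raise RuntimeError("arcpy is required to run this workflow")
    # Step 1: interpolate to file geodatabase storage at cell size 1
    arcpy.LasDatasetToRaster_conversion(
        LASD, FGDB_RASTER, "ELEVATION",
        "BINNING AVERAGE LINEAR", "FLOAT", "CELLSIZE", cell_size)
    # Step 2: copy the finished raster into the enterprise geodatabase
    arcpy.CopyRaster_management(FGDB_RASTER, SDE_RASTER)
```

This mirrors the manual steps above; it does not explain why the direct-to-SDE run loses data, it just automates the workaround.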
This is with ArcGIS Desktop 10.3.1, ArcGIS Server 10.3.1, Microsoft SQL Server 2012, and Windows Server 2012.
Is there some SQL Server setting that needs to be tuned to create these rasters directly into SDE storage?