
xyz to raster

10-31-2016 03:54 PM
forestknutsen1
MVP Regular Contributor

I have ~1500 xyz files (I also have the corresponding las files) that I want to make into DEMs. I have tested the workflow below, and it looks like it works as expected.

How To: Convert points in XYZ file format to raster 

My point spacing is 3 feet for all of the xyz files, so I am thinking I can script it or run the tools in batch mode to get through the 1500 files. I am wondering if there is a best practice for the raster interpolation portion of the workflow? I used IDW for the test with the default settings.
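
To make the plan concrete, here is roughly the gridding idea in plain Python (just a sketch with numpy; it assumes space-delimited "x y z" lines on a regular 3-foot lattice -- the real run would use the arcpy tools from the linked doc):

```python
import numpy as np

def xyz_to_grid(lines, spacing=3.0):
    """Snap regularly spaced 'x y z' points onto a 2-D elevation grid.

    Assumes space-delimited text lines with points on a regular
    `spacing`-foot lattice; cells with no point stay NaN (NoData).
    """
    pts = np.array([[float(v) for v in ln.split()] for ln in lines])
    x0, y0 = pts[:, 0].min(), pts[:, 1].min()
    cols = np.rint((pts[:, 0] - x0) / spacing).astype(int)
    rows = np.rint((pts[:, 1] - y0) / spacing).astype(int)
    grid = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    grid[rows, cols] = pts[:, 2]
    return grid

# toy 2x2 "tile" at 3-foot spacing
demo = ["0 0 10.0", "3 0 11.0", "0 3 12.0", "3 3 13.0"]
grid = xyz_to_grid(demo)
```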

Thanks!

1 Solution

Accepted Solutions
CodyBenkelman
Esri Regular Contributor

Forest

As with so many questions, the simple (but not helpful) answer is "it depends".  

To be more helpful, I'll expand on some of the considerations - but as Dan indicated, you should test IDW vs. NaturalNeighbor on a small sample to see which will work best for you.  In general IDW will leave you with a rougher DEM (although possibly more accurate), whereas NatNeighbor will be smoother - but there are other considerations:

  1. Do your XYZ files have values at every XY location, or are there gaps?  (and are the XY values regularly spaced?)  IDW may not interpolate across large gaps (depending on settings) whereas NatNeighbor will fill your tiles to the edges.
  2. How much time have you invested in your current files?  If you've done a lot of editing and QC (I'm assuming these are bare earth ground points from LAS) and the file boundaries are very specific, you can proceed with this workflow, but I'm concerned about doing anything one tile at a time without considering neighbors - you may end up with slight artifacts at every edge that will give you an unpleasant result when all are combined.
  3. Is processing time a concern?  The support doc you reference will take more processing time than the alternative I describe below, although realistically the learning curve and setup time on my recommended workflow will likely take you longer overall.
  4. Who will use the outputs, and in what software?  (see end for explanation of this)

The above should guide you to pick a good area to test - I'd suggest finding an area with some larger "NoData" gaps (if you have any) especially at tile boundaries, and process either 4 tiles in 2x2 arrangement or perhaps more, and look at a hillshade of the resulting outputs to see if there are edge effects.  If any of your tiles have NoData at the edges (and you do fill the NoData voids), I'm sure you'll find discontinuities if you process these files separately to create DEMs.
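
For a quick scripted version of that hillshade check, here is a minimal sketch in numpy (Horn-style shading; the 315/45 azimuth/altitude defaults are my assumption, matching the usual hillshade defaults -- this is not the ArcGIS Hillshade tool itself):

```python
import numpy as np

def hillshade(dem, cellsize=3.0, azimuth=315.0, altitude=45.0):
    """Simple hillshade (0-255) from a DEM array; handy for eyeballing
    seams and edge artifacts after mosaicking interpolated tiles."""
    az = np.radians(360.0 - azimuth + 90.0)
    alt = np.radians(altitude)
    dy, dx = np.gradient(dem, cellsize)          # rates of change per cell
    slope = np.arctan(np.hypot(dx, dy))
    aspect = np.arctan2(-dx, dy)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(255 * shaded, 0, 255)
```

On a perfectly flat surface this returns a uniform value, so any banding you see along tile boundaries is coming from the interpolation, not the shading.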

If you have not invested a lot of effort thus far in the files, and/or if a small test does not work out well, I'd encourage you to look at our recommended workflow for generating DEMs from Lidar in the Image Management Guidebook (look for the Aerial Lidar section here:  Image Management ).  This would presume you have good ground classification for your points in the LAS files, but I'm suggesting you may not want to use the XYZ point files and just use our workflow to go from LAS to tiled DEMs in Tiff format.  This would process all of your data as one logical project (or several large projects, if it's really not contiguous data) to avoid edge artifacts, and the downloadable tool referenced in this workflow (LAS Dataset to Tiled Rasters) gives you a lot of flexibility for creating DEM tiles including file names, boundaries, and metadata to go with the tiles.

re: my question #4 above, an option that some organizations use is NOT to fill the NoData voids (or "not if greater than XX pixels"), and then when the DEM tiles are used (presuming users have either ArcGIS Desktop/Pro or a web client built with our APIs), if you manage them with a Mosaic Dataset and add the "Elevation Void Fill" function, you can have a watertight DEM but allow the user to control the parameters of the void filling that best suit their application, and also explicitly see where the voids are so they know which values are being interpolated.  I wouldn't suggest this if you have public users, but if you are serving technical staff in your organization, they may prefer to see the data voids before proceeding.
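
To illustrate what that on-the-fly void filling amounts to (just the concept in numpy -- this is not Esri's Elevation Void Fill function or its actual parameters):

```python
import numpy as np

def fill_voids(dem, max_void_cells=None):
    """Fill NaN (NoData) cells with an inverse-distance-weighted value
    from the valid cells. If the total void count exceeds
    `max_void_cells`, leave the voids in place (analogous to a
    'don't fill if larger than XX pixels' style setting)."""
    out = dem.copy()
    void = np.isnan(out)
    if not void.any():
        return out
    if max_void_cells is not None and void.sum() > max_void_cells:
        return out
    vr, vc = np.nonzero(~void)
    vals = out[vr, vc]
    for r, c in zip(*np.nonzero(void)):
        d2 = (vr - r) ** 2 + (vc - c) ** 2
        w = 1.0 / d2
        out[r, c] = (w * vals).sum() / w.sum()
    return out
```

The point of doing this at display time (via the mosaic dataset function) rather than baking it into the tiles is exactly what is described above: users keep control of the fill parameters and can still see where the real data ends.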

I hope that is helpful. 
Cody B.


4 Replies
DanPatterson_Retired
MVP Emeritus

There is no perfect interpolator. Pick one; if you don't like the results, you can see whether others work better given the nature of your data.


forestknutsen1
MVP Regular Contributor

Thanks for the feedback Cody.

1) The xyz data looks tight. There are no gaps and the points are at 3 foot spacing across all of the files I have looked at.

2) This is a vendor-provided dataset. They have given us bare earth as well as what they call "Full Feature" (I am not sure if that is an industry-standard term, but it is the surface reflection). But all I care about is the bare earth. I don't have a lot of time into the processing so far; mostly just research to understand the correct path forward. I would guess that the vendor has ensured that the edges of the xyz "tiles" mesh up nicely. Artifacts on the seams would be bad.

3) Processing time is not a big deal. I would like to be done in a few days. Based on my work so far I think this is realistic.  

4) The data will be used in ArcGIS Desktop and other geospatial applications such as Petrosys. In the end it is a little hard to predict all of the ways it will be used. My plan is to make TIFFs from it and then create a mosaic dataset. People can use the TIFFs if their application cannot consume the mosaic. Thank you for the void fill tip.

If this workflow does not pan out I will try your alternate workflow. 

DanPatterson_Retired
MVP Emeritus

To comment on Cody's statement about "it depends" not being helpful:

Perhaps the sheer number of interpolators and their locations across the toolsets is a first clue that this is an area where "it depends" should guide one to examine what each interpolator does and what it works best at.

3D analyst            An overview of the Raster Interpolation toolset—Help | ArcGIS for Desktop 

Geostats analyst An overview of the Interpolation toolset—Help | ArcGIS for Desktop 

Spatial analyst      An overview of the Interpolation toolset—Help | ArcGIS for Desktop 

There is some duplication, but some tools are unique to each toolset.

Then there is the question of your data... once you understand the interpolators, you can view them in light of your data.