LAS to Raster conversion

03-25-2014 06:04 AM
New Contributor

I'm having a problem setting the interpolation type in the arcpy.LasDatasetToRaster_conversion python code...I wish to use:

Triangulation, LINEAR, NO_THINNING as the options...

however there appears to be no choice when hovering over interpolation type in the Python window, and I receive the runtime error "Cannot set input into parameter interpolation_type" when using 'TRIANGULATION LINEAR WINDOW_SIZE 10' as described in the ArcGIS 10.1 help guide:

The guide refers to arcpy.LasDatasetToRaster_3d in the Python window (this doesn't appear to exist, and I have checked out a 3D Analyst license), and arcpy.conversion.LasDatasetToRaster in a standalone Python script raises the same runtime error...

Is this a known ESRI bug, or am I doing something wrong? Any help would be much appreciated.


New Contributor

I tried following the support bug NIM-092807.

I changed the interpolation type to "TRIANGULATION NO_THINNING MAXIMUM 1"

Input code: arcpy.conversion.LasDatasetToRaster("test", "test2", "ELEVATION",  "TRIANGULATION NO_THINNING MAXIMUM 1", "FLOAT", "CELLSIZE", "1", "1")

I received the same runtime error as before:

Runtime error
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "c:\program files (x86)\arcgis\desktop10.1\arcpy\arcpy\", line 2329, in LasDatasetToRaster
    raise e
ExecuteError: ERROR 000622: Failed to execute (LAS Dataset to Raster). Parameters are not valid.
ERROR 000628: Cannot set input into parameter interpolation_type.


Esri Esteemed Contributor
I would recommend contacting Tech Support. This may be a bug; if it is, Tech Support will be able to log it and find a possible workaround.
Occasional Contributor III

I have been in contact with Christine Dartiguenave from ESRI, and she was able to figure out the correct syntax for this tool in Python. This has been tested in ArcMap 10.2.2 and is integrated into a tool I have that converts multiple raw LAS files into multiple DEM tiles or one large mosaicked DEM. Of course, the inputs, outputs, and cell size must be specified. The tool works best when the input LAS dataset is generated with MakeLasDatasetLayer_management to filter a class code, such as 2 for bare earth. Hope this helps.

arcpy.LasDatasetToRaster_conversion(inputLasd, outputRaster,"ELEVATION","TRIANGULATION NATURAL_NEIGHBOR NO_THINNING MAXIMUM 0","FLOAT","CELLSIZE",cellSize,"1")
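For anyone assembling this call in a script, the interpolation argument is a single space-delimited string. A minimal sketch below shows how the tokens fit together before being passed as the fourth argument; the helper name `make_triangulation_interp` is my own invention for illustration, not part of arcpy:

```python
# Sketch: build the space-delimited interpolation_type string that
# LAS Dataset To Raster expects. The helper name is hypothetical;
# arcpy only consumes the final assembled string.

def make_triangulation_interp(method="NATURAL_NEIGHBOR",
                              thinning="NO_THINNING",
                              selection="MAXIMUM",
                              resolution=0):
    # All four trailing tokens must be present, even when thinning
    # is NO_THINNING and the last two are effectively ignored.
    return "TRIANGULATION {0} {1} {2} {3}".format(
        method, thinning, selection, resolution)

interp = make_triangulation_interp()
# interp == "TRIANGULATION NATURAL_NEIGHBOR NO_THINNING MAXIMUM 0"
```

The resulting string can then be dropped into the call above in place of the literal "TRIANGULATION NATURAL_NEIGHBOR NO_THINNING MAXIMUM 0".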

New Contributor

Yes, it really does seem to be a bug, and I am surprised that nobody from ESRI has corrected it so far. All the help materials and script examples still contain incorrect information. When I tested the tool (using ArcGIS 10.2 and Python 2.7), I found that the problem with running it via script lies in the number of items in the {interpolation_type} string. Even when you choose an interpolation type that uses no point thinning (which should actually be written as "NO_THINNING" instead of the "NONE" the tool help advises), you must still supply the Point Selection Method and Resolution items. Those items are not applied in the calculation, but without them the tool does not run at all.

I have tested several options, and in my case all the settings using Binning work, e.g.:

arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "BINNING IDW LINEAR", "FLOAT", "CELLSIZE", "1", "1")

and also settings using Triangulation such as:

arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "TRIANGULATION LINEAR NO_THINNING MAXIMUM 0", "FLOAT", "CELLSIZE", "1", "1")

arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "TRIANGULATION NATURAL_NEIGHBOR WINDOW_SIZE MAXIMUM 0", "FLOAT", "CELLSIZE", "1", "1")

etc.

It seems that I can use any number instead of the "0" for Resolution, and the result is always the same. Another strange thing is that the result does not match what the tool in ArcGIS produces with the same settings and "0", but instead matches what the tool in ArcGIS outputs with the default Resolution value calculated uniquely for each dataset (e.g. 0.153884). I have not found any way to influence the thinning resolution setting at all.
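To illustrate the observation above, that the Resolution token appears to be ignored but must still be present, here is a small sketch that builds several candidate strings and checks they all carry the full token count the tool insists on. The helper name `triangulation_interp` is mine, not an arcpy function:

```python
# Sketch: the resolution token seems to be ignored when thinning is
# NO_THINNING, yet it must still appear in the string. Only the
# token count (5 for triangulation) determines whether the
# parameter validates; the helper name here is hypothetical.

def triangulation_interp(method, thinning="NO_THINNING",
                         selection="MAXIMUM", resolution=0):
    return "TRIANGULATION {0} {1} {2} {3}".format(
        method, thinning, selection, resolution)

# Varying the resolution changes the string but not its shape.
candidates = [triangulation_interp("LINEAR", resolution=r)
              for r in (0, 1, 10)]
assert all(len(c.split()) == 5 for c in candidates)
```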

I hope this helps, and that others who get stuck on this bug as I did will not need to test all the options again.

New Contributor

Hi community,

Similar problem with arcpy.LasDatasetToTin_3d, where the results differ between the manual settings (left) and the Python settings (right).

My manual parameters (screenshot not shown) should equate to these in Python, based on the somewhat unclear documentation:

thintype = 'WINDOW_SIZE'
thinmethod = 'MAX'
thinvalue = '1'
maxnodes = '20000000'
zfactor = '1'

arcpy.LasDatasetToTin_3d(inlasd, outtin, thintype, thinmethod, thinvalue, maxnodes, zfactor)

Have those issues/bugs been resolved, or am I also doing something wrong?

