I have been in contact with Christine Dartiguenave from ESRI, and she was able to figure out the correct syntax for this tool in Python. This has been tested in ArcMap 10.2.2 and is integrated into a tool I have that converts multiple raw LAS files into multiple DEM tiles or one large mosaicked DEM. The inputs, outputs, and cell size must of course be specified. The tool works best when the input LASD is generated with MakeLasDatasetLayer_management to filter on a class code such as 2 (bare earth). Hope this helps.
arcpy.LasDatasetToRaster_conversion(inputLasd, outputRaster,"ELEVATION","TRIANGULATION NATURAL_NEIGHBOR NO_THINNING MAXIMUM 0","FLOAT","CELLSIZE",cellSize,"1")
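For anyone unsure which class code to filter on, here is a quick reference sketch of the classification codes from the standard ASPRS LAS specification (the dictionary name is my own, not part of any Esri API; class 2 is the bare-earth code mentioned above):

```python
# Common ASPRS LAS point classification codes, useful when choosing the
# class_code filter for MakeLasDatasetLayer_management.
LAS_CLASS_CODES = {
    1: "Unclassified",
    2: "Ground",
    3: "Low Vegetation",
    4: "Medium Vegetation",
    5: "High Vegetation",
    6: "Building",
    7: "Low Point (noise)",
    9: "Water",
}

print(LAS_CLASS_CODES[2])  # Ground
```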
Yes, it really does seem to be a bug, and I am surprised that nobody at ESRI has corrected it so far. The help materials and script examples still contain plainly wrong information. When I tested the tool (ArcGIS 10.2, Python 2.7), I found that the problem with scripting it lies in the number of items required in the {interpolation_type} parameter. Even when you want an interpolation type that uses no point thinning (which should be specified as "NO_THINNING" rather than "NONE" as the tool help advises), you must still supply the Point Selection Method and Resolution items. Even though those items are not applied in the calculation, the tool will not run at all without them.
I have tested several options, and in my case all the settings using Binning work, e.g.:
arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "BINNING IDW LINEAR", "FLOAT", "CELLSIZE", "1", "1")
and also settings using Triangulation such as:
arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "TRIANGULATION LINEAR NO_THINNING MAXIMUM 0", "FLOAT", "CELLSIZE", "1", "1")
arcpy.LasDatasetToRaster_conversion(lasd, output, "ELEVATION", "TRIANGULATION NATURAL_NEIGHBOR WINDOW_SIZE MAXIMUM 0", "FLOAT", "CELLSIZE", "1", "1") etc.
It seems that I can use any number instead of "0" for Resolution and the result is always the same. Another strange thing is that the result is not equal to running the tool in ArcGIS with the same settings and Resolution "0", but it is equal to the result ArcGIS produces with the default Resolution value calculated uniquely for each dataset (e.g. 0.153884). I did not find any way to influence the thinning resolution setting at all.
I hope this helps, so that others who get stuck on this bug as I did will not need to test all the options again.
Hi community,
I have a similar problem with arcpy.LasDatasetToTin_3d, where the results differ between the manual (left) and Python (right) settings.
My manual parameters should equate to these in Python, based on the somewhat unclear documentation:
thintype = 'WINDOW_SIZE'
thinmethod = 'MAX'
thinvalue = '1'
maxnodes = '20000000'
zfactor = '1'
arcpy.LasDatasetToTin_3d(inlasd, outtin, thintype, thinmethod, thinvalue, maxnodes, zfactor)
Have these issues/bugs been resolved, or am I also doing something wrong?
Thanks.