Wentao,
Thanks for the questions and feedback.
Making a DEM:
The different pyramid filter options affect how data gets thinned and generalized; they do not affect the full resolution data. If you are only building terrains in order to create rasters from the full resolution data (TerrainToRaster pyramid resolution level set to 0.0), then it doesn't matter whether you use windowsize or z-tolerance, or what pyramid resolutions were specified when defining the terrain. Going on this assumption, I recommend the windowsize filter since it's faster. You can also build the terrain using a much larger windowsize than 2 (e.g., 50); it will be faster still. If, on the other hand, you want to rasterize using a thinned pyramid level instead of the full resolution level, then you need to start thinking about pyramid type and pyramid level resolutions.
If you choose windowsize, might rasterize on a thinned pyramid level or view the terrain at different display scales, and your lidar is bare earth, then I think ZMean is a good option to use. If you have first return lidar, then I recommend ZMax.
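To make the filter behavior concrete, here is a minimal sketch in plain Python (not arcpy; the function name and structure are mine, and the real pyramid filter is internal to ArcGIS) of the idea behind a windowsize filter: keep one point per window, chosen by a z criterion.

```python
import math

def thin_by_window(points, windowsize, criterion="ZMEAN"):
    """Keep one point per windowsize x windowsize window.

    points: list of (x, y, z) tuples.
    criterion: "ZMIN", "ZMAX", or "ZMEAN" (keep the point whose z is
    closest to the window's mean z).
    Illustrative sketch only, not the actual ArcGIS implementation.
    """
    # Bin points into square windows keyed by integer window coordinates.
    windows = {}
    for p in points:
        key = (math.floor(p[0] / windowsize), math.floor(p[1] / windowsize))
        windows.setdefault(key, []).append(p)

    kept = []
    for pts in windows.values():
        if criterion == "ZMIN":
            kept.append(min(pts, key=lambda p: p[2]))
        elif criterion == "ZMAX":
            kept.append(max(pts, key=lambda p: p[2]))
        else:  # ZMEAN: the point most representative of the window's average height
            mean_z = sum(p[2] for p in pts) / len(pts)
            kept.append(min(pts, key=lambda p: abs(p[2] - mean_z)))
    return kept
```

The intuition matches the recommendation above: for bare-earth points, ZMean keeps a representative ground height per window; for first returns, ZMax favors the highest surface.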
You can build a 1-meter DEM with a terrain made from any windowsize. The question is, what pyramid level are you performing the rasterization on? If your lidar is sampled at 1 meter and you want to make a 1-meter DEM, then it really doesn't matter what pyramid type you use or how the pyramid resolutions are defined, because when you run TerrainToRaster you ought to rasterize using the full resolution data by specifying pyramid resolution level = 0.0. I would not recommend building the 1-meter DEM from a coarser resolution pyramid level. On the other hand, if you wanted to make a 2-meter DEM from 1-meter lidar points, then rasterizing based on a 2-meter windowsize is reasonable. It will run faster, not just because of the larger cellsize, but also because you're processing a thinned point set.
If your terrain's tilesize is always 447, it means you're always declaring a point spacing of 1; the tilesize we use depends on the point spacing you declare. So, does all your lidar have a 1-meter nominal point spacing? If so, then this is fine.
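For what it's worth, 447 is consistent with the tile system targeting on the order of 200,000 points per tile, since sqrt(200000) is about 447.2. That target is my assumption, not something I'm quoting from the docs, but if it holds, the tile width simply scales with the declared point spacing:

```python
import math

# Assumption (mine, for illustration): tiles are sized to hold roughly
# 200,000 points, so tile width = sqrt(points per tile) * point spacing.
TARGET_POINTS_PER_TILE = 200_000

def tile_size(point_spacing):
    """Estimated terrain tile width for a declared nominal point spacing."""
    return math.sqrt(TARGET_POINTS_PER_TILE) * point_spacing

print(round(tile_size(1.0)))  # 447 for a declared 1-meter spacing
print(round(tile_size(2.0)))  # 894 for a declared 2-meter spacing
```

Under that assumption, seeing 447 every time is exactly what you'd get from always entering a point spacing of 1.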
Out of memory:
The 'out of memory' error you got when building the terrain is likely related to data density. The first thing to check is the point spacing used to define the terrain. Is it correct? What is the nominal point spacing of the lidar, and is that the value used to define the terrain? Another consideration is breaklines: if you have very densely sampled breaklines, they should influence your point spacing estimate.
Are some areas of your terrain sampled more densely than others? If you're building a terrain from data that has different sample densities, then you should specify the point spacing of the densest data when defining the terrain. For example, if an urban core is sampled at twice the density of the rest of the data used to build the terrain, specify the point spacing of the urban core. The terrain's tile system needs to be based on the densest data.
If you think your data is of relatively constant density, I suggest testing that to make sure. Do this using PointToRaster with your lidar multipoint feature class. Specify an output cellsize that's 4x the point spacing of your lidar and use the COUNT option as the cell assignment type. (The value field doesn't matter when using COUNT.) If your lidar has a 1-meter point spacing and you COUNT on a 4-meter cellsize, then you would expect to see, on average, a value of 16 in the resulting raster cells. If that's not the case, i.e., there's a significantly larger count somewhere, that could explain the problem.
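The arithmetic behind that sanity check can be sketched in a few lines of plain Python (the 2x threshold for flagging a cell is my own choice, not a rule from the tools):

```python
def expected_count(cellsize, point_spacing):
    """Average points per cell for evenly spaced lidar:
    one point per point_spacing x point_spacing square."""
    return (cellsize / point_spacing) ** 2

def flag_dense_cells(counts, cellsize, point_spacing, factor=2.0):
    """Return the COUNT values that exceed the expected average by `factor`,
    i.e. spots dense enough to throw off the terrain's point spacing estimate."""
    limit = expected_count(cellsize, point_spacing) * factor
    return [c for c in counts if c > limit]

print(expected_count(4.0, 1.0))  # 16.0 -- matches the 4x cellsize rule of thumb
```

So with 1-meter lidar counted on a 4-meter cellsize, cells holding far more than 16 points would be the ones to investigate.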
Regards, Clayton