Trying to generalize some 3D lines, but it does not seem to obey the distance tolerance in the Z space. Documentation doesn't seem to say anything either way. Would like to confirm if the generalizer even supports 3D? If not, any recommendations on simplifying a 3D line?
Actually I'm trying to do this in FME, but it was simplifying the same way... so using an ArcGIS GP tool was going to be the backup plan, but it doesn't appear to obey Z values either. Running 10.4 at the moment.
Thanks a lot!
I will say no... generalization in 2D has some applications, such as data simplification based on proximity or duplicates.
I think generalizing along the Z axis would be a bit more complicated, since one would presumably have to perform the generalization in the X,Y directions first.
Now will there be a counter-point???
Hi Dan,
Just found this discussion, and I have another application where I need 3D generalization. My customer maintains a road network and wants to print kilometrage on maps that can be used by car drivers and should be consistent with their trip meters. Since part of the network is in mountainous terrain, we decided to use the 3D length derived from a 10 m DGM and accumulate it in M values. As a first step, we get the Z component onto the road network with the Interpolate Surface tool, so that we can calculate the 3D length. You might argue that this does not represent the "true" length of the road, but it is certainly closer to reality than the 2D length.
The tool densifies the road network with vertices about every 10 m, blowing up the size of the dataset by about 100%. We now want to generalize the dataset to get rid of vertices that have no influence on the 3D length or the M values - that is, vertices that fall on a straight line in 3D space (a straight line in XY and zero or constant slope).
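In case it helps to make that concrete, here is a minimal arcpy sketch (an assumption on my part, not our actual script) of the step that accumulates 3D length into the M values on an already densified, Z-aware polyline feature class. The feature class name roads_3d is just a placeholder.

import math
import arcpy

fc = "roads_3d"  # hypothetical Z- and M-enabled polyline feature class
sr = arcpy.Describe(fc).spatialReference

with arcpy.da.UpdateCursor(fc, ["SHAPE@"]) as cursor:
    for (shape,) in cursor:
        new_parts = []
        for part in shape:
            m = 0.0
            prev = None
            new_part = arcpy.Array()
            for pt in part:
                if prev is not None:
                    # 3D segment length: planar distance combined with the Z difference
                    dxy = math.hypot(pt.X - prev.X, pt.Y - prev.Y)
                    dz = (pt.Z or 0.0) - (prev.Z or 0.0)
                    m += math.sqrt(dxy * dxy + dz * dz)
                new_part.add(arcpy.Point(pt.X, pt.Y, pt.Z, m))  # M = cumulative 3D length
                prev = pt
            new_parts.append(new_part)
        cursor.updateRow([arcpy.Polyline(arcpy.Array(new_parts), sr, True, True)])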
My approach for now is this:
I still think that the Douglas-Peucker algorithm should not be too difficult to implement in 3D space, so maybe I will give it a try some day as an alternative.
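For anyone who wants to experiment, here is a rough, self-contained Python sketch of what a Douglas-Peucker pass extended to 3D could look like. It measures the perpendicular distance to the 3D chord between the endpoints; the function names and the tolerance handling are my own assumptions, not an Esri implementation.

import math

def dist_point_to_chord_3d(p, a, b):
    # Perpendicular distance from point p to the line through a and b (all (x, y, z) tuples)
    vx, vy, vz = b[0] - a[0], b[1] - a[1], b[2] - a[2]
    wx, wy, wz = p[0] - a[0], p[1] - a[1], p[2] - a[2]
    # |v x w| / |v| gives the perpendicular distance
    cx = vy * wz - vz * wy
    cy = vz * wx - vx * wz
    cz = vx * wy - vy * wx
    chord = math.sqrt(vx * vx + vy * vy + vz * vz)
    if chord == 0.0:
        return math.sqrt(wx * wx + wy * wy + wz * wz)
    return math.sqrt(cx * cx + cy * cy + cz * cz) / chord

def douglas_peucker_3d(points, tolerance):
    # Simplify a list of (x, y, z) tuples, keeping vertices farther than tolerance from the chord
    if len(points) < 3:
        return list(points)
    idx, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = dist_point_to_chord_3d(points[i], points[0], points[-1])
        if d > dmax:
            idx, dmax = i, d
    if dmax <= tolerance:
        return [points[0], points[-1]]
    left = douglas_peucker_3d(points[: idx + 1], tolerance)
    right = douglas_peucker_3d(points[idx:], tolerance)
    return left[:-1] + right  # drop the duplicated split vertex

Applied per line with a tolerance in the units of the data, this keeps only the vertices that matter in all three dimensions.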
I found what I was looking for. There is a Generalize3D method in the ArcObjects interface IPolycurve3D. It just hasn't been exposed in a toolbox tool.
Hi Dan, thanks for that information.
We are basically after simplification based on proximity. We have ~60,000 3D lines, each varying from 1,500 to 6,000 feet long. Each of these lines has a vertex at every foot along it; that's a lot! So after running the current Generalize tool, a line might go from 2,000 vertices to 100. We just need the line to stay within a tolerance of about 1/2 a foot.
Our drivers for this are:
1. Performance with the generalized dataset in ArcMap and ArcScene is tremendously better.
2. We need to intersect these 3D lines with 3D multipatches... that intersection GP tool works great but takes many, many hours when run on the non-generalized dataset. The multipatches that intersect these lines change often, so we have a nightly job to recalculate the intersections.
That's our reasoning, at least. Thanks again for the help, and if you or anyone else has any recommendations to solve this issue, that would be much appreciated.
I am curious now. Are your lines straight, or do they collectively follow a pattern?
I am thinking along the lines of your data forming a 3D array. It would be easy to determine points of departure in the Z direction and then generalize in X,Y based on that, rather than generalizing in X,Y first. If you have any more comments on the form of the data, that would be interesting.
I think that generalization in the Z axis depends on the DEM.
After creating the DEM, make your line get its elevation from the DEM surface:
layer properties > Base Heights > Elevation from surfaces > Floating on a custom surface
Abdullah... this is for vector, not raster.
X,Y,Z data can be represented as either sparse or dense arrays, and it is possible to generalize the array in the third dimension, then extract those array elements that deviate either globally or sequentially from a specified value or outside a range.
As a sparse array, you are not confined by a cell size (which limits the spatial resolution to finite units), since there is no requirement for equal spacing between locations.
This type of analysis is currently the forte of analytical tools such as NumPy, SciPy, R, etc., which we are beginning to interface with ArcMap.
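To make that a bit more tangible, here is a loose NumPy sketch (a sketch only, not a finished method) that flags vertices whose Z departs from the straight-line Z trend between the endpoints by more than a tolerance; the n x 3 array layout and the 0.5 ft tolerance are assumptions borrowed from this thread.

import numpy as np

def z_departure_mask(xyz, tol=0.5):
    # Return a boolean mask of vertices to keep, judged only on Z departure
    xyz = np.asarray(xyz, dtype=float)
    # Chainage (cumulative 2D distance) used as the interpolation axis
    d = np.r_[0.0, np.cumsum(np.hypot(np.diff(xyz[:, 0]), np.diff(xyz[:, 1])))]
    # Z predicted by a straight line from the first to the last vertex
    z_line = np.interp(d, [d[0], d[-1]], [xyz[0, 2], xyz[-1, 2]])
    keep = np.abs(xyz[:, 2] - z_line) > tol
    keep[0] = keep[-1] = True  # always keep the endpoints
    return keep

# Example: a gently climbing line with one sharp Z spike at the third vertex
line = [(0, 0, 10.0), (10, 0, 10.2), (20, 0, 13.0), (30, 0, 10.6), (40, 0, 10.8)]
print(z_departure_mask(line))  # only the spike and the endpoints survive

Vertices flagged this way could then feed a second, conventional X,Y generalization pass, which is roughly the ordering suggested above.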
Yes, logically that is for vector; look at Ryan Coodey's question, Dan.
Is "Trying to generalize some 3D lines" about raster lines?
About cell size:
When you get elevation from survey points, you shouldn't waste any of them.
For example, if I have 5 survey points in one square meter and set my cell size to 1 meter, that means wasting survey data.
Your cell size should not contain more than one point (in 3D modeling applications).
So you should be careful with the cell size and make it small enough that no cell contains more than one point; that will make the lines as smooth as the survey allows.
I'm not talking about subdividing cell pixels after creating a raster, or increasing the number of pixels without more data, Dan,
because the output would have the same pixel values and we would not see a difference.
They are working with vectors, so I am not sure of your point. Rasters and arrays are two different data structures; they may often share a similar outward appearance visually, but they can be completely different conceptually and structurally.