Note: I should move the ever-expanding philosophy to its own page and give more detail on demos --
JackSnoeyink - 02 Jun 2008
Our thesis has always been that the amount of data and the level of interest vary across terrain depending on the application, a fact that is most easily seen with bare earth lidar. Thus, our project considers how to represent and efficiently work with such irregularly sampled data. Point selection is a key part of representation for three reasons:
- The most efficient representations (including splines and wavelets) capture many sample points as a combination of basis elements described by few parameters; often a selected set of points determines which basis elements and what parameters.
- The sampling rate may not reflect the level of interest; we may wish to select the data capturing roof corners (or registration) or curbs (for drainage), and reduce the data representing the roofs and roads themselves.
- For any sensor, some fraction of data points are erroneous and should be deleted; this greatly complicates the task of point selection for level of interest and efficient representation.
Point Selection Framework
In our project, we have developed a point selection framework based on three methods: local neighborhood fitting, refinement, and decimation.
Local neighborhood fitting and filtering
A notion of the neighborhood of a point and a method of fitting give a way to classify points as erroneous, unimportant, or important.
Notions of neighborhood
Waldo Tobler's first law of geography is "Everything is related to everything else, but near things are more related than distant things." Raster representations have a natural notion of the neighborhood of a point -- you simply take all the grid points within a given distance of the point, subject to the limitations imposed by the grid spacing and the number of points that can be handled. When one applies this idea to irregular data, there are additional choices: the neighborhood can be the points within spatial proximity, as in the grid, or can be the k nearest points, or the k-ring of a triangulation. We focus on the k-ring for meshes, but also use the finalization grid to define neighborhoods.
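The k-nearest-points notion above can be sketched in a few lines. This is an illustrative Python sketch, not one of our streaming tools; a production tool would use a spatial index (grid or k-d tree) rather than brute force:

```python
# Sketch of the k-nearest-neighbor notion of neighborhood for
# irregularly sampled points. Brute force: sort all other points
# by Euclidean distance and keep the closest k.
import math

def k_nearest(points, p, k):
    """Return the k points of `points` closest to p (excluding p itself)."""
    others = [q for q in points if q != p]
    others.sort(key=lambda q: math.dist(p, q))
    return others[:k]

pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (2.0, 2.0), (0.5, 0.5)]
print(k_nearest(pts, (0.0, 0.0), 2))  # → [(0.5, 0.5), (1.0, 0.0)]
```

The k-ring of a triangulation replaces the distance test with mesh connectivity: the 1-ring is the set of vertices sharing an edge with p, and the k-ring iterates that k times.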
Fitting methods
In urban data, points on roofs or walls are often well-fit by a simple plane.
In more natural portions of the terrain, subdivision surfaces and wavelets are better-suited fitting choices. We have implemented a number of fitting methods in MATLAB prototypes, and ported the most promising to StreamingModules.
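As an illustration of the planar case, here is a least-squares fit of z = a*x + b*y + c to a point neighborhood. This is a self-contained Python sketch (our actual fitting code lives in the MATLAB prototypes and StreamingModules); it solves the 3x3 normal equations by Cramer's rule, which is fine at this size when the points are not degenerate in x, y:

```python
# Least-squares plane fit z = a*x + b*y + c to a list of (x, y, z)
# points, via the normal equations A^T A [a,b,c]^T = A^T z where each
# row of A is [x, y, 1].
def fit_plane(points):
    sxx = sum(x * x for x, y, z in points)
    sxy = sum(x * y for x, y, z in points)
    sx = sum(x for x, y, z in points)
    syy = sum(y * y for x, y, z in points)
    sy = sum(y for x, y, z in points)
    n = len(points)
    sxz = sum(x * z for x, y, z in points)
    syz = sum(y * z for x, y, z in points)
    sz = sum(z for x, y, z in points)

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    M = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    r = [sxz, syz, sz]
    d = det3(M)  # nonzero when the (x, y) positions are not collinear
    coeffs = []
    for j in range(3):  # Cramer's rule: replace column j by r
        Mj = [row[:] for row in M]
        for i in range(3):
            Mj[i][j] = r[i]
        coeffs.append(det3(Mj) / d)
    return tuple(coeffs)  # (a, b, c)

# Points lying exactly on z = 2x - y + 3 are recovered exactly:
a, b, c = fit_plane([(0, 0, 3), (1, 0, 5), (0, 1, 2), (1, 1, 4), (2, 1, 6)])
```

The residual |z - (a*x + b*y + c)| at each point then measures how well the neighborhood is explained by a roof or wall plane.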
Filtering
The development of our streaming tools that support auxiliary data (see StreamingModules) is motivated by the need to annotate points with the results of neighborhood and fitting computations to support selection and filtering. Since each new data set has its own characteristics, the aim is to provide efficient, flexible tools that can classify and filter in this framework.
RingFilt and classx implement streaming modules that are among our first examples.
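The classify-and-filter idea can be shown with a toy rule. This is only an illustrative Python sketch, not the logic of RingFilt or classx: it flags a point whose elevation departs too far from the median elevation of its neighborhood, a simple robust test for lidar outliers such as birds or multipath returns:

```python
# Classify a point by comparing its elevation z against the median
# elevation of its neighborhood; points far from the median are
# flagged as outliers.
import statistics

def classify(z, neighbor_z, tol):
    med = statistics.median(neighbor_z)
    return 'outlier' if abs(z - med) > tol else 'ok'

print(classify(10.0, [1.1, 0.9, 1.0, 1.2], tol=2.0))  # → outlier
print(classify(1.05, [1.1, 0.9, 1.0, 1.2], tol=2.0))  # → ok
```

In the streaming setting, the classification label would be carried as an auxiliary attribute on the point so that downstream modules can filter on it.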
Refinement
Refinement begins with a few points and refines the representation where the error is large. The VIP (very important point) insertion heuristic of Chen & Guevara (1988) was simply to add the point having greatest error (see also Garland and Heckbert's scape), but this tends to incorporate erroneous points into the model. By pairing refinement with local fitting, we can reject erroneous points during insertion.
RTINRefinementDemo is a demo from one of our MATLAB prototypes.
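A one-dimensional analogue of this greedy insertion loop can be written compactly. This Python sketch is not the RTINRefinementDemo code (which works on 2D triangulations in MATLAB); it refines a piecewise-linear profile by repeatedly inserting the sample with greatest vertical error:

```python
# Greedy 1D refinement: start with the endpoints, then repeatedly
# insert the sample whose error against the current piecewise-linear
# model is largest, until every error is within tol.
def refine(samples, tol):
    """samples: list of (x, z) pairs sorted by x. Returns selected points."""
    selected = [samples[0], samples[-1]]
    while True:
        selected.sort()
        worst, worst_err = None, tol
        for x, z in samples:
            # Interpolate the current model at x.
            for (x0, z0), (x1, z1) in zip(selected, selected[1:]):
                if x0 <= x <= x1:
                    zi = z0 if x1 == x0 else z0 + (z1 - z0) * (x - x0) / (x1 - x0)
                    if abs(z - zi) > worst_err:
                        worst, worst_err = (x, z), abs(z - zi)
                    break
        if worst is None:
            return selected
        selected.append(worst)

print(refine([(0, 0), (1, 5), (2, 0), (3, 0), (4, 0)], 1.0))
# → [(0, 0), (1, 5), (2, 0), (4, 0)]
```

The fitting-based rejection described above would add one test here: before inserting the worst point, check it against its local neighborhood fit, and skip it if it looks erroneous.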
Decimation
Decimation begins with a complete model, then finds and removes points whose deletion causes the least change.
smsimp uses this method with the QuadricErrorMetric.
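Again a one-dimensional analogue makes the loop concrete. This Python sketch is not smsimp, which decimates triangle meshes using the quadric error metric; it repeatedly deletes the interior sample whose removal perturbs a piecewise-linear profile least:

```python
# Greedy 1D decimation: repeatedly remove the interior sample whose
# deletion changes the piecewise-linear model least, stopping when
# every candidate removal would introduce error >= tol.
def decimate(samples, tol):
    """samples: list of (x, z) pairs sorted by x. Returns retained points."""
    pts = list(samples)
    while len(pts) > 2:
        best_i, best_err = None, tol
        for i in range(1, len(pts) - 1):
            (x0, z0), (x, z), (x1, z1) = pts[i - 1], pts[i], pts[i + 1]
            # Error introduced if pts[i] were removed: its distance to
            # the segment joining its neighbors, measured vertically.
            zi = z0 + (z1 - z0) * (x - x0) / (x1 - x0)
            if abs(z - zi) < best_err:
                best_i, best_err = i, abs(z - zi)
        if best_i is None:
            break
        del pts[best_i]
    return pts

print(decimate([(0, 0), (1, 0.1), (2, 0), (3, 5), (4, 0)], 1.0))
# → [(0, 0), (2, 0), (3, 5), (4, 0)]
```

The quadric error metric generalizes this idea to surfaces: each vertex accumulates a quadratic form measuring squared distance to its incident planes, and the vertex (or edge collapse) of least quadric error is removed first.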
Compressing TINs
By the way, I should mention our compressed mesh format, which stores irregular mesh connectivity far more efficiently than traditional TIN representations. For details and executable demos, please see StreamingCompression or visit http://www.cs.unc.edu/~isenburg/smc/
--
JackSnoeyink - 30 May 2008
--
ChristianStith - 09 Jun 2008