Results 1–10 of 11
Natural Neighbor Interpolation Based Grid DEM Construction Using a GPU
 In ACM GIS ’10: Proceedings of the 18th ACM SIGSPATIAL International Symposium on Advances in Geographic Information Systems
, 2010
Abstract

Cited by 2 (1 self)
With modern LiDAR technology the amount of topographic data, in the form of massive point clouds, has increased dramatically. One of the most fundamental GIS tasks is to construct a grid digital elevation model (DEM) from these 3D point clouds. In this paper we present a simple yet very fast algorithm for constructing a grid DEM from massive point clouds using natural neighbor interpolation (NNI). We use a graphics processing unit (GPU) to significantly speed up the computation. To handle the large data sets and to deal with graphics hardware limitations, clever blocking schemes are used to partition the point cloud. For example, using standard desktop computers and graphics hardware, we construct a high-resolution grid with 150 million cells from two billion points in less than thirty-seven minutes. This is about one-tenth of the time required for the same computer to perform a standard linear interpolation, which produces a much less smooth surface.
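The blocking idea described in this abstract can be illustrated with a minimal CPU sketch. Inverse-distance weighting stands in for the paper's natural neighbor interpolation (which requires a Voronoi diagram), and all names and parameters are illustrative, not from the paper:

```python
def build_dem(points, nx, ny, bounds, block=64):
    """Construct a grid DEM from 3D points, one block of cells at a time.

    points: list of (x, y, z) samples. Inverse-distance weighting is a
    stand-in for natural neighbor interpolation. Processing the grid in
    tiles mirrors the paper's blocking scheme, where each tile only needs
    the points near it to be resident in (GPU) memory.
    """
    xmin, ymin, xmax, ymax = bounds
    dx, dy = (xmax - xmin) / nx, (ymax - ymin) / ny
    dem = [[0.0] * nx for _ in range(ny)]
    for by in range(0, ny, block):
        for bx in range(0, nx, block):
            # A real implementation would load only points near this tile.
            for j in range(by, min(by + block, ny)):
                for i in range(bx, min(bx + block, nx)):
                    cx = xmin + (i + 0.5) * dx
                    cy = ymin + (j + 0.5) * dy
                    num = den = 0.0
                    for px, py, pz in points:
                        d2 = (px - cx) ** 2 + (py - cy) ** 2
                        if d2 < 1e-12:          # cell centre hits a sample
                            num, den = pz, 1.0
                            break
                        w = 1.0 / d2            # inverse-distance weight
                        num += w * pz
                        den += w
                    dem[j][i] = num / den
    return dem
```

The interpolated value at each cell lies between the minimum and maximum sample heights, so cells near the low sample stay low and cells near the high sample stay high.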
Lipschitz unimodal and isotonic regression on paths and trees
, 2008
Abstract

Cited by 2 (1 self)
Let M = (V, A) be a planar graph, let γ ≥ 0 be a real parameter, and t: V → R a height function. A γ-Lipschitz unimodal regression (γ-LUR) of t is a function s: V → R such that s has a unique local minimum, |s(u) − s(v)| ≤ γ for each {u, v} ∈ A, and ‖s − t‖₂² = Σ_{v∈V} (s(v) − t(v))² is minimized. Here, a local minimum of s is a vertex v such that s(u) > s(v) for any neighbor u of v. For a directed planar graph, s: V → R is the γ-Lipschitz isotonic regression (γ-LIR) of t if s(u) ≤ s(v) ≤ s(u) + γ for each directed edge (u, v) and ‖s − t‖₂² is minimized. These problems arise, for example, in topological simplification of a height function. We present near-linear-time algorithms for the LUR and LIR problems for two special cases where M is a path or a tree.
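On a path, the γ → ∞ case of γ-LIR (only the monotonicity constraint s(u) ≤ s(v), no Lipschitz upper bound) is classical L2 isotonic regression, solvable by pool-adjacent-violators. A minimal sketch, not the paper's near-linear-time algorithm for general γ:

```python
def isotonic_regression(t):
    """L2 isotonic regression of the sequence t on a path via
    pool-adjacent-violators: repeatedly merge adjacent blocks whose
    means violate monotonicity, replacing them by their weighted mean."""
    blocks = []  # each block: (mean of pooled values, number of values)
    for x in t:
        blocks.append((x, 1))
        # Merge while the last two block means are out of order.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            (m2, w2), (m1, w1) = blocks[-2], blocks[-1]
            blocks[-2:] = [((m2 * w2 + m1 * w1) / (w2 + w1), w2 + w1)]
    s = []
    for m, w in blocks:
        s.extend([m] * w)
    return s
```

For example, `isotonic_regression([1, 3, 2])` pools the violating pair 3, 2 into two copies of their mean 2.5, yielding a nondecreasing fit.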
The Complexity of Flow on Fat Terrains and its I/O-Efficient Computation
Abstract

Cited by 2 (1 self)
We study the complexity and the I/O-efficient computation of flow on triangulated terrains. We present an acyclic graph, the descent graph, that enables us to trace flow paths in triangulations I/O-efficiently. We use the descent graph to obtain I/O-efficient algorithms for computing river networks and watershed-area maps in O(Sort(d + r)) I/Os, where r is the complexity of the river network and d of the descent graph. Furthermore we describe a data structure based on the subdivision of the terrain induced by the edges of the triangulation and paths of steepest ascent and descent from its vertices. This data structure can be used to report the boundary of the watershed of a query point q or the flow path from q in O(l(s) + Scan(k)) I/Os, where s is the complexity of the subdivision underlying the data structure, l(s) is the number of I/Os used for planar point location in this subdivision, and k is the size of the reported output. On α-fat terrains, that is, triangulated terrains where the minimum angle of any triangle is bounded from below by α, we show that the worst-case complexity of the descent graph and of any path of steepest descent is O(n/α²), where n is the number of triangles in the terrain. The worst-case complexity of the river network and the above-mentioned data structure on such terrains is O(n²/α²). When α is a positive constant this improves the corresponding bounds for arbitrary terrains by a linear factor. We prove that similar bounds cannot be proven for Delaunay triangulations: these can have river networks of complexity Θ(n³).
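The flow tracing that this abstract studies on triangulations has a much simpler grid analogue, D8 steepest descent, sketched below purely for illustration (the paper's descent graph and I/O-efficiency are not captured here):

```python
def d8_flow_path(dem, start):
    """Trace a path of steepest descent on a grid DEM: from each cell,
    step to the lowest of its eight neighbours (the D8 analogue of the
    paper's flow paths on triangulations). Stops at a pit, i.e. a cell
    with no strictly lower neighbour."""
    ny, nx = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        best, drop = None, 0.0
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < ny and 0 <= nc < nx:
                    d = dem[r][c] - dem[nr][nc]
                    if d > drop:            # steepest strictly-downhill step
                        best, drop = (nr, nc), d
        if best is None:
            return path                     # reached a pit (local minimum)
        r, c = best
        path.append(best)
```

On a plane tilted toward one corner, the traced path runs straight down the diagonal to the lowest cell.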
I/O-Efficient Algorithms for Computing Contours on a Terrain
 In SCG ’08: Proceedings of the Twenty-Fourth Annual Symposium on Computational Geometry
, 2008
Abstract

Cited by 1 (1 self)
A terrain M is the graph of a bivariate function. We assume that M is represented as a triangulated surface with N vertices. A contour (or isoline) of M is a connected component of a level set of M. Generically, each contour is a closed polygonal curve; at “critical” levels these curves may touch each other or collapse to a point. We present I/O-efficient algorithms for the following two problems related to computing contours of M: (i) Given a sequence ℓ₁ < · · · < ℓₛ of real numbers, we present an I/O-optimal algorithm that reports all contours of M at heights ℓ₁, ..., ℓₛ using O(sort(N) + T/B) I/Os, where T is the total number of edges in the output contours, B is the “block size”, and sort(N) is the number of I/Os needed to sort N elements.
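The basic geometric step behind contour computation on a triangulated terrain is intersecting a single triangle with the plane z = ℓ. A minimal sketch (generic case only; the “critical” cases the abstract mentions, where a vertex lies exactly at the level, are ignored here):

```python
def triangle_contour(tri, level):
    """Intersect one terrain triangle with the level set at `level`.
    tri is three (x, y, z) vertices; returns the contour segment as a
    pair of 2D points, or None when the plane z = level misses the
    triangle. Crossing points are found by linear interpolation along
    the edges, matching the piecewise-linear terrain model."""
    pts = []
    for i in range(3):
        (x1, y1, z1) = tri[i]
        (x2, y2, z2) = tri[(i + 1) % 3]
        if (z1 < level) != (z2 < level):     # this edge crosses the level
            t = (level - z1) / (z2 - z1)
            pts.append((x1 + t * (x2 - x1), y1 + t * (y2 - y1)))
    return tuple(pts) if len(pts) == 2 else None
```

A full contour is then obtained by chaining these per-triangle segments across adjacent triangles.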
Evaluating Hydrology Preservation of Simplified Terrain Representations
Abstract
We present an error metric based on the potential energy of water flow to evaluate the quality of lossy terrain simplification algorithms. Typically, terrain compression algorithms seek to minimize RMS (root mean square) and maximum error. These metrics fail to capture whether a reconstructed terrain preserves the drainage network. A quantitative measurement of how accurately a drainage network captures the hydrology is important for determining the effectiveness of a terrain simplification technique. Having a measurement for testing and comparing different models has the potential to be widely used in numerous applications (flood prevention, erosion measurement, pollutant propagation, etc.). In this paper, we transfer the drainage network computed on reconstructed geometry onto the original uncompressed terrain and use our error metric to measure the level of error created by the simplification. We also present a novel terrain simplification algorithm based on the compression of hydrology features. This method and other terrain compression schemes are then compared using our new metric.
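The transfer-and-compare idea can be sketched in a few lines. This is a hypothetical stand-in for the paper's metric, not its actual definition: drainage energy is approximated as the summed elevation of the network's cells (unit mass, g = 1), and both networks are evaluated on the original DEM, mirroring the transfer step the abstract describes:

```python
def drainage_energy(dem, network):
    """Potential-energy proxy for a drainage network: sum of elevations
    of the network's cells on the given DEM (unit mass, g = 1).
    Hypothetical simplification of the paper's energy-based metric."""
    return sum(dem[r][c] for (r, c) in network)

def hydrology_error(original_dem, original_net, reconstructed_net):
    """Relative energy gap between the true drainage network and the
    network induced by the simplified terrain, both measured on the
    ORIGINAL DEM (the transfer step from the abstract)."""
    e_true = drainage_energy(original_dem, original_net)
    e_reco = drainage_energy(original_dem, reconstructed_net)
    return abs(e_true - e_reco) / max(abs(e_true), 1e-12)
```

A reconstruction whose drainage network coincides with the original scores 0; the more the simplified terrain reroutes flow through higher ground, the larger the error.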
I/O-Efficient Computation of Water Flow Across a Terrain
Abstract
Consider rain falling at a uniform rate onto a terrain T represented as a triangular irregular network. Over time, water collects in the basins of T, forming lakes that spill into adjacent basins. Our goal is to compute, for each terrain vertex, the time this vertex is flooded (covered by water). We present an I/O-efficient algorithm that solves this problem using O(sort(X) log(X/M) + sort(N)) I/Os, where N is the number of terrain vertices, X is the number of pits of the terrain, sort(N) is the cost of sorting N data items, and M is the size of the computer’s main memory. Our algorithm assumes that the volumes and watersheds of the basins of T have been precomputed using existing methods.
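A much simplified, in-memory relative of this problem is computing, for each cell of a grid DEM, the water *level* at which it is first covered (actual flooding times additionally need the basin volumes and the rain rate the abstract assumes). The classic priority-flood sweep computes these levels; a minimal sketch on a grid, with water draining freely off the boundary:

```python
import heapq

def flood_level(dem):
    """For each cell, the water surface elevation at which it is first
    flooded, assuming water escapes freely across the grid boundary.
    Priority-flood sweep: grow inward from the boundary, always
    expanding the lowest frontier cell, so each interior cell is reached
    via its lowest escape route."""
    ny, nx = len(dem), len(dem[0])
    level = [[None] * nx for _ in range(ny)]
    pq = []
    for r in range(ny):
        for c in range(nx):
            if r in (0, ny - 1) or c in (0, nx - 1):
                level[r][c] = dem[r][c]          # boundary floods at its own height
                heapq.heappush(pq, (dem[r][c], r, c))
    while pq:
        h, r, c = heapq.heappop(pq)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < ny and 0 <= nc < nx and level[nr][nc] is None:
                # Flooded once water tops the lowest path out of its basin.
                level[nr][nc] = max(h, dem[nr][nc])
                heapq.heappush(pq, (level[nr][nc], nr, nc))
    return level
```

A pit of height 1 ringed by cells of height 3 is flooded only when the water reaches level 3, the height of its spill point.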
I/O-Efficient Contour Queries on Terrains
Abstract
A terrain M can be represented as a triangulation of the plane along with a height function associated with the vertices (and linearly interpolated within the edges and triangles) of M. We investigate the problem of answering contour queries on M: Given a height ℓ and a triangle f of M that intersects the level set of M at height ℓ, report the list of the edges of the connected component of this level set that intersect f, sorted in clockwise or counterclockwise order. Contour queries are different from level-set queries in that only one contour (connected component of the level set) out of all those that may exist is expected to be reported. We present an I/O-efficient data structure of linear size that answers a contour query in O(log_B N + T/B) I/Os, where N is the number of triangles in the terrain and T is the number of edges in the output contour. The data structure can be constructed using O(Sort(N)) I/Os.
Cleaning Massive Sonar Point Clouds
Abstract
We consider the problem of automatically cleaning massive sonar data point clouds, that is, the problem of automatically removing noisy points that for example appear as a result of scans of (shoals of) fish, multiple reflections, scanner self-reflections, refraction in gas bubbles, and so on. We describe a new algorithm that avoids the problems of previous local-neighbourhood based algorithms. Our algorithm is theoretically I/O-efficient, that is, it is capable of efficiently processing massive sonar point clouds that do not fit in internal memory but must reside on disk. The algorithm is also relatively simple and thus practically efficient, partly due to the development of a new simple algorithm for computing the connected components of a graph embedded in the plane. A version of our cleaning algorithm has already been incorporated in a commercial product.
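The connected-components subroutine this abstract mentions can be illustrated with a standard in-memory union-find (the paper's contribution is an I/O-efficient variant for planar-embedded graphs, which this sketch does not attempt). In a cleaning pipeline, small components of the neighbourhood graph would then be discarded as noise:

```python
def connected_components(n, edges):
    """Label the connected components of a graph on vertices 0..n-1
    using union-find with path halving. Returns a list where two
    vertices share a value iff they are in the same component."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    for u, v in edges:
        parent[find(u)] = find(v)           # union the two components
    return [find(x) for x in range(n)]
```

For five points whose neighbourhood graph has edges (0,1), (1,2), (3,4), vertices 0–2 form one component and 3–4 another.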
Efficient extraction of drainage networks from massive, radar-based
, 2010
"... www.hydrol-earth-syst-sci.net/15/667/2011/ ..."