Results 1–10 of 11
Optimal Regular Volume Sampling
Abstract

Cited by 39 (7 self)
The classification of volumetric data sets, as well as their rendering algorithms, is typically based on the representation of the underlying grid. Grid structures based on a Cartesian lattice are the de facto standard for regular representations of volumetric data. In this paper we introduce a more general concept of regular grids for the representation of volumetric data. We demonstrate that a specific type of regular lattice, the so-called body-centered cubic, is able to represent the same data set as a Cartesian grid to the same accuracy, but with 29.3% fewer samples. This speeds up traditional volume rendering algorithms by the same ratio, which we demonstrate by adopting a splatting implementation for these new lattices. We investigate different filtering methods required for computing the normals on this lattice. The lattice representation also results in lossless compression ratios that are better than previously reported. Although other regular grid structures achieve the same sample efficiency, the body-centered cubic is particularly easy to use. The only assumption necessary is that the underlying volume is isotropic and band-limited, an assumption that is valid for most practical data sets.
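The lattice construction and the 29.3% figure are easy to check concretely. A minimal sketch (not the paper's code, and `bcc_lattice` is an illustrative helper of my own naming): a BCC lattice is a Cartesian grid plus a second copy shifted by half a cell along all three axes, and the claimed saving is the factor 1 − 1/√2:

```python
import numpy as np

def bcc_lattice(n, spacing=1.0):
    """Return BCC sample positions over an n x n x n region (illustrative).

    A BCC lattice can be built as two interleaved Cartesian grids:
    the second copy sits at the body centers, offset by half the cell
    size in every axis.
    """
    ax = np.arange(n) * spacing
    cart = np.stack(np.meshgrid(ax, ax, ax, indexing="ij"), axis=-1).reshape(-1, 3)
    centers = cart + 0.5 * spacing          # body-center copy
    return np.vstack([cart, centers])

# For an isotropic, band-limited signal, BCC sampling needs only a
# fraction 1/sqrt(2) of the Cartesian sample count for the same accuracy:
reduction = 1.0 - 1.0 / np.sqrt(2.0)
print(f"sample reduction: {reduction:.1%}")  # → sample reduction: 29.3%
```

The reduction factor comes from sphere packing in the frequency domain: the dual of the BCC lattice packs the spectrum's spherical support more tightly than the Cartesian lattice does.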
Reconstruction Schemes for High Quality Raycasting of the Body-Centered Cubic Grid
Abstract

Cited by 12 (5 self)
The body-centered cubic (BCC) grid has recently received attention in the volume visualization community due to its ability to represent the same data with almost 30% fewer samples than the Cartesian cubic (CC) grid. In this paper we present several resampling strategies for raycasting BCC grids. These strategies range from 2D interpolation in planes, to piecewise linear (barycentric) interpolation in a tetrahedral decomposition of the grid, to trilinear and sheared trilinear interpolation. We compare them to raycasting with comparable resampling techniques on the commonly used CC grid in terms of computational complexity and visual quality.
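As an illustration of the Cartesian baseline the comparison starts from, here is a minimal trilinear-interpolation sketch (an assumed implementation, not the paper's; the BCC variants such as sheared trilinear interpolation build on the same blend-per-axis idea):

```python
import numpy as np

def trilinear(volume, p):
    """Trilinearly interpolate a scalar volume at continuous position p = (x, y, z).

    Takes the 8 samples surrounding p and blends along each axis in turn.
    Assumes p lies strictly inside the grid (illustrative sketch only).
    """
    i, j, k = np.floor(p).astype(int)
    fx, fy, fz = p - np.array([i, j, k])
    c = volume[i:i + 2, j:j + 2, k:k + 2]  # the 8 surrounding samples
    c = c[0] * (1 - fx) + c[1] * fx        # collapse along x
    c = c[0] * (1 - fy) + c[1] * fy        # collapse along y
    return c[0] * (1 - fz) + c[1] * fz     # collapse along z

vol = np.arange(8, dtype=float).reshape(2, 2, 2)
print(trilinear(vol, np.array([0.5, 0.5, 0.5])))  # → 3.5 (cell center = mean)
```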
The Practice of Finitism: Epsilon Calculus and Consistency Proofs in Hilbert's Program
, 2001
Abstract

Cited by 5 (3 self)
After a brief flirtation with logicism in 1917–1920, David Hilbert proposed his own program in the foundations of mathematics in 1920 and developed it, in concert with collaborators such as Paul Bernays and Wilhelm Ackermann, throughout the 1920s. The two technical pillars of the project were the development of axiomatic systems for ever stronger and more comprehensive areas of mathematics and finitistic proofs of consistency of these systems. Early advances in these areas were made by Hilbert (and Bernays) in a series of lecture courses at the University of Göttingen between 1917 and 1923, and notably in Ackermann's dissertation of 1924. The main innovation was the invention of the ε-calculus, on which Hilbert's axiom systems were based, and the development of the ε-substitution method as a basis for consistency proofs. The paper traces the development of the "simultaneous development of logic and mathematics" through the ε-notation and provides an analysis of Ackermann's consisten...
Hilbert’s Program Then and Now
, 2005
Abstract

Cited by 4 (0 self)
Hilbert’s program is, in the first instance, a proposal and a research program in the philosophy and foundations of mathematics. It was formulated in the early 1920s by German mathematician David Hilbert (1862–1943), and was pursued by him and his collaborators at the University of Göttingen and elsewhere in the 1920s.
Neural networks for financial time series prediction: Overview over recent research
, 2002
Quantum Field Theory as Eigenvalue Problem
, 2003
Abstract

Cited by 1 (1 self)
A mathematically well-defined, manifestly covariant theory of classical and quantum fields is given, based on Euclidean Poisson algebras and a generalization of the Ehrenfest equation, which implies the stationary action principle. The theory opens a constructive spectral approach to finding physical states both in relativistic quantum field theories and for flexible phenomenological few-particle approximations.
EQUIDECOMPOSABILITY (SCISSORS CONGRUENCE) OF POLYHEDRA IN R³ AND R⁴ IS ALGORITHMICALLY DECIDABLE: HILBERT'S 3rd PROBLEM REVISITED
Abstract

Cited by 1 (0 self)
Hilbert's third problem: brief reminder. It is known that in the plane, every two polygons P and P′ of equal area A(P) = A(P′) are scissors congruent (equidecomposable), i.e., they can both be decomposed into the same finite number of pairwise congruent polygonal pieces: P = P_1 ∪ ... ∪ P_p, P′ = P′_1 ∪ ... ∪ P′_p, and P_i ∼ P′_i. In one of the 23 problems that D. Hilbert formulated in 1900 as a challenge to 20th century mathematics, namely in Problem No. 3, Hilbert asked whether every two polyhedra P and P′ with the same volume V(P) = V(P′) are equidecomposable (Hilbert 1900). This problem was the first to be solved: already in 1900, Dehn proved (Dehn 1900, Dehn 1901) that there exists a tetrahedron of volume 1 which is not scissors congruent with a unit cube. He proved it by showing that for every additive function f: R → R from real numbers to real numbers for which f(2π) = 0, the expression...
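In standard treatments of Hilbert's third problem, the expression the truncated abstract refers to is Dehn's edge functional; the following is a reconstruction of the well-known argument, not necessarily the paper's exact wording. For a polyhedron P with edges e of length ℓ(e) and dihedral angle θ(e):

```latex
D_f(P) \;=\; \sum_{e \in \mathrm{edges}(P)} \ell(e)\, f\bigl(\theta(e)\bigr)
```

D_f is preserved under cutting P into pieces and reassembling them. Additivity makes f Q-linear, so f(2π) = 0 forces f(π/2) = 0; hence the cube, whose dihedral angles are all π/2, has D_f = 0. The regular tetrahedron's dihedral angle arccos(1/3) is not a rational multiple of π, so f can be chosen with f(arccos(1/3)) ≠ 0, giving D_f ≠ 0 and ruling out scissors congruence with the cube.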
Proof Nets for Intuitionistic Logic
 SAARBRÜCKEN, GERMANY
, 2006
Abstract

Cited by 1 (0 self)
Until the beginning of the 20th century, there was no way to reason formally about proofs. In particular, the question of proof equivalence had never been explored. When Hilbert asked in 1920 for an answer to this very question in his famous program, people started looking for proof formalizations.
Natural deduction and sequent calculi, which were invented by Gentzen in 1935, quickly became two of the main tools for the study of proofs. Gentzen’s Hauptsatz on normal forms for his sequent calculi, and later Prawitz’s analogous theorem for natural deduction, put forth a first notion of equivalent proofs in intuitionistic and classical logic.
However, natural deduction only works well for intuitionistic logic. This is why Girard invented proof nets in 1986 as an analogue of natural deduction for (the multiplicative fragment of) linear logic. Their universal structure also made proof nets interesting for other logics. Proof nets have the great advantage that they eliminate most of the bureaucracy involved in deductive systems and are thus probably closer to the essence of a proof. There has recently been increasing interest in the development of proof nets for various kinds of logics. In 2005, for example, Lamarche and Straßburger were able to express sequent proofs in classical logic as proof nets.
In this thesis, I will, starting from proof nets for classical logic, turn the focus back to intuitionistic logic and propose proof nets that are suited as an extension of natural deduction. I will examine these nets and characterize those corresponding to natural deduction proofs. Additionally, I provide a cut elimination procedure for the new proof nets and prove termination and confluence for this reduction system, thus effectively obtaining a new notion of equivalence of intuitionistic proofs.
Identity and Reference in web-based Knowledge Representation (IRKR)
, 2009
Abstract
The Semantic Web initiative has brought forward the idea that the web may become a space not only for publishing and interlinking documents (through HTML hyperlinks), but also for publishing and interlinking knowledge bases (e.g. in the ...