Results 1–10 of 16
Optimal Regular Volume Sampling
Abstract

Cited by 48 (9 self)
The classification of volumetric data sets, as well as their rendering algorithms, is typically based on the representation of the underlying grid. Grid structures based on a Cartesian lattice are the de facto standard for regular representations of volumetric data. In this paper we introduce a more general concept of regular grids for the representation of volumetric data. We demonstrate that a specific type of regular lattice, the so-called body-centered cubic (BCC), can represent the same data set as a Cartesian grid to the same accuracy, but with 29.3% fewer samples. This speeds up traditional volume rendering algorithms by the same ratio, which we demonstrate by adapting a splatting implementation to these new lattices. We investigate the different filtering methods required for computing normals on this lattice. The lattice representation also yields lossless compression ratios better than previously reported. Although other regular grid structures achieve the same sample efficiency, the body-centered cubic is particularly easy to use. The only assumption necessary is that the underlying volume is isotropic and band-limited, an assumption that is valid for most practical data sets.
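The 29.3% saving quoted above is the classical sample-density ratio 1 − 1/√2 between an optimally scaled BCC lattice and a Cartesian grid for isotropic band-limited data. A minimal sketch (the function name `bcc_lattice` is a hypothetical helper, not from the paper) that generates BCC sample positions, i.e. the Cartesian cell corners plus one body-center point per cell:

```python
import numpy as np

def bcc_lattice(n, spacing=1.0):
    """Points of a body-centered cubic lattice over an n x n x n block
    of cells: the cell corners plus one body-center point per cell."""
    ax = np.arange(n) * spacing
    corners = np.stack(
        np.meshgrid(ax, ax, ax, indexing="ij"), axis=-1
    ).reshape(-1, 3)
    centers = corners + 0.5 * spacing  # body centers, offset half a cell
    return np.vstack([corners, centers])

pts = bcc_lattice(4)
print(len(pts))                 # 2 samples per cell: 2 * 4**3 = 128
# Sample-count saving of BCC over Cartesian for isotropic data:
print(1.0 - 1.0 / np.sqrt(2))   # ~0.2929, the 29.3% quoted above
```

Note the saving does not come from this raw point count (a BCC lattice has *twice* the points of a Cartesian grid at equal spacing); it comes from the fact that the BCC lattice may be scaled coarser while still capturing the same spherically band-limited spectrum.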
Reconstruction Schemes for High Quality Raycasting of the Body-Centered Cubic Grid
Abstract

Cited by 12 (5 self)
The body-centered cubic (BCC) grid has received attention in the volume visualization community recently due to its ability to represent the same data with almost 30% fewer samples than the Cartesian cubic (CC) grid. In this paper we present several resampling strategies for raycasting BCC grids. These strategies range from 2-D interpolation in planes, to piecewise linear (barycentric) interpolation in a tetrahedral decomposition of the grid, to trilinear and sheared trilinear interpolation. We compare them to raycasting with comparable resampling techniques in the commonly used CC grid in terms of computational complexity and visual quality.
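Of the strategies listed, trilinear interpolation on a Cartesian grid is the usual baseline such comparisons are made against. A minimal sketch (not the paper's code) of trilinear resampling of a scalar volume at an arbitrary continuous position:

```python
import numpy as np

def trilinear(vol, p):
    """Trilinear interpolation of the 3-D scalar field `vol` at the
    continuous position p = (x, y, z), assumed strictly inside the grid."""
    i = np.floor(p).astype(int)
    fx, fy, fz = p - i
    # the 2 x 2 x 2 block of samples surrounding p
    c = vol[i[0]:i[0] + 2, i[1]:i[1] + 2, i[2]:i[2] + 2].astype(float)
    c = c[0] * (1 - fx) + c[1] * fx      # collapse the x axis
    c = c[0] * (1 - fy) + c[1] * fy      # collapse the y axis
    return c[0] * (1 - fz) + c[1] * fz   # collapse the z axis

# Trilinear interpolation reproduces any linear field exactly:
x, y, z = np.meshgrid(np.arange(4), np.arange(4), np.arange(4), indexing="ij")
vol = x + 2 * y + 3 * z
print(trilinear(vol, np.array([1.25, 0.5, 1.75])))  # 1.25 + 1.0 + 5.25 = 7.5
```

The "sheared trilinear" variant mentioned in the abstract adapts this eight-sample stencil to the BCC lattice by shearing the cells so that the body-center samples line up with the corner samples; the barycentric variant instead uses the four vertices of the enclosing tetrahedron.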
Hilbert’s Program Then and Now
, 2005
Abstract

Cited by 10 (0 self)
Hilbert’s program is, in the first instance, a proposal and a research program in the philosophy and foundations of mathematics. It was formulated in the early 1920s by German mathematician David Hilbert (1862–1943), and was pursued by him and his collaborators at the University of Göttingen and elsewhere in the 1920s.
The Practice of Finitism: Epsilon Calculus and Consistency Proofs in Hilbert's Program
, 2001
Abstract

Cited by 6 (3 self)
After a brief flirtation with logicism in 1917–1920, David Hilbert proposed his own program in the foundations of mathematics in 1920 and developed it, in concert with collaborators such as Paul Bernays and Wilhelm Ackermann, throughout the 1920s. The two technical pillars of the project were the development of axiomatic systems for ever stronger and more comprehensive areas of mathematics, and finitistic proofs of consistency of these systems. Early advances in these areas were made by Hilbert (and Bernays) in a series of lecture courses at the University of Göttingen between 1917 and 1923, and notably in Ackermann's dissertation of 1924. The main innovation was the invention of the ε-calculus, on which Hilbert's axiom systems were based, and the development of the ε-substitution method as a basis for consistency proofs. The paper traces the development of the "simultaneous development of logic and mathematics" through the ε-notation and provides an analysis of Ackermann's consisten...
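For readers unfamiliar with the ε-calculus named above: its core device is the ε-term, governed by the so-called critical axiom. In the standard textbook formulation (supplied here for context, not quoted from the paper):

```latex
% The term \varepsilon x\, A(x) denotes "some x satisfying A, if one exists".
% Critical (transfinite) axiom:
A(t) \rightarrow A(\varepsilon x\, A(x))
% The quantifiers then become definable:
\exists x\, A(x) \;\equiv\; A(\varepsilon x\, A(x)), \qquad
\forall x\, A(x) \;\equiv\; A(\varepsilon x\, \neg A(x))
```

The ε-substitution method mentioned above attempts to replace each ε-term in a proof by a concrete numerical witness, iterating until all critical axioms become true numerical statements; termination of this process is what yields a consistency proof.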
A Computable Kolmogorov Superposition Theorem
, 2000
Abstract

Cited by 5 (0 self)
In the year 1900, in his famous lecture in Paris, Hilbert formulated 23 challenging problems which inspired many groundbreaking mathematical investigations in the last century. Among these problems, the 13th was concerned with the solution of higher-order algebraic equations. Hilbert conjectured that such equations are not solvable by functions which can be constructed as substitutions of continuous functions of only two variables. Surprisingly, 57 years later Hilbert's conjecture was refuted when Kolmogorov succeeded in proving his Superposition Theorem, which states that each multivariate continuous real-valued function can be represented as a superposition and composition of continuous functions of only one variable. Thirty years later, this theorem found an interesting application in the theory of neural networks. We will prove a computable version of Kolmogorov's Superposition Theorem in the rigorous framework of computable analysis: each multivariate computable real-valued function can be represented as a superposition and composition of computable real-number functions of only one variable, and such a representation can even be determined effectively. As a consequence, one obtains a characterization of the computational power of feedforward neural networks with computable activation functions and weights.
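The classical statement Kolmogorov proved in 1957, given here in its standard form for context (the paper's computable version replaces "continuous" by "computable" throughout):

```latex
f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\Bigl( \sum_{p=1}^{n} \varphi_{q,p}(x_p) \Bigr)
```

Here every multivariate continuous function $f : [0,1]^n \to \mathbb{R}$ is expressed via continuous one-variable functions $\Phi_q$ and $\varphi_{q,p}$, where the inner functions $\varphi_{q,p}$ can be chosen independently of $f$.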
Proof theory of Martin-Löf type theory. An overview
 Mathématiques et Sciences Humaines, 42e année, no. 165:59–99
, 2004
Abstract

Cited by 4 (2 self)
We give an overview of the historic development of proof theory and the main techniques used in ordinal-theoretic proof theory. We argue that in a revised Hilbert's programme, ordinal-theoretic proof theory has to be supplemented by a second step, namely the development of strong equiconsistent constructive theories. We then show how, as part of such a programme, the proof-theoretic analysis of Martin-Löf type theory with W-type and one microscopic universe containing only two finite sets is carried out. Then we look at the analysis of Martin-Löf type theory with W-type and a universe closed under the W-type, and consider the extension of type theory by one Mahlo universe and its proof-theoretic analysis. Finally we review the concept of inductive-recursive definitions, which extends the notion of inductive definitions substantially. We introduce a closed formalisation, which can be used in generic programming, and explain what is known about its strength.
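The W-type mentioned above is Martin-Löf's type of well-founded trees. Its standard formation and introduction rules, in the usual textbook notation (supplied for context, not taken from the paper), are:

```latex
\frac{A : \mathsf{Set} \qquad B : A \to \mathsf{Set}}
     {(\mathsf{W}\,x{:}A)\,B(x) : \mathsf{Set}}
\qquad
\frac{a : A \qquad f : B(a) \to (\mathsf{W}\,x{:}A)\,B(x)}
     {\mathsf{sup}(a, f) : (\mathsf{W}\,x{:}A)\,B(x)}
```

A tree $\mathsf{sup}(a, f)$ has a root labelled by $a : A$ and one subtree $f(b)$ for every branch index $b : B(a)$; the well-foundedness of these trees is what gives the theory its proof-theoretic strength.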
Neural networks for financial time series prediction: Overview over recent research
, 2002
"... ..."
Proof Nets for Intuitionistic Logic
 Saarbrücken, Germany
, 2006
Abstract

Cited by 2 (0 self)
Until the beginning of the 20th century, there was no way to reason formally about proofs. In particular, the question of proof equivalence had never been explored. When Hilbert asked in 1920 for an answer to this very question in his famous program, people started looking for proof formalizations.
Natural deduction and sequent calculi, which were invented by Gentzen in 1935, quickly became two of the main tools for the study of proofs. Gentzen’s Hauptsatz on normal forms for his sequent calculi, and later on Prawitz’ analog theorem for natural deduction, put forth a first notion of equivalent proofs in intuitionistic and classical logic.
However, natural deduction only works well for intuitionistic logic. This is why Girard invented proof nets in 1986 as an analog to natural deduction for (the multiplicative fragment of) linear logic. Their universal structure made proof nets also interesting for other logics. Proof nets have the great advantage that they eliminate most of the bureaucracy involved in deductive systems and so are probably closer to the essence of a proof. There has recently been an increasing interest in the development of proof nets for various kinds of logics. In 2005 for example, Lamarche and Straßburger were able to express sequent proofs in classical logic as proof nets.
In this thesis, I will, starting from proof nets for classical logic, turn the focus back on intuitionistic logic and propose proof nets that are suited as an extension of natural deduction. I will examine these nets and characterize those corresponding to natural deduction proofs. Additionally, I provide a cut elimination procedure for the new proof nets and prove termination and confluence for this reduction system, thus effectively obtaining a new notion of equivalence of intuitionistic proofs.
Equidecomposability (Scissors Congruence) of Polyhedra in R³ and R⁴ Is Algorithmically Decidable: Hilbert's 3rd Problem Revisited
Abstract

Cited by 1 (0 self)
Hilbert's third problem: brief reminder. It is known that in a plane, every two polygons P and P ′ of equal area A(P) = A(P ′) are scissors congruent (equidecomposable), i.e., they can both be decomposed into the same finite number of pairwise congruent polygonal pieces: P = P1 ∪... ∪ Pp, P ′ = P ′ 1 ∪... ∪ P ′ p, and Pi ∼ P ′ i. In one of the 23 problems that D. Hilbert formulated in 1900 as a challenge to 20th-century mathematics, namely, in Problem No. 3, Hilbert asked whether every two polyhedra P and P ′ with the same volume V (P) = V (P ′ ) are equidecomposable (Hilbert 1900). This problem was the first to be solved: already in 1900, Dehn proved (Dehn 1900, Dehn 1901) that there exists a tetrahedron of volume 1 which is not scissors congruent with a unit cube. He proved it by showing that for every additive function f: R → R from real numbers to real numbers for which f(2π) = 0, the expression
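The sentence above breaks off at "the expression", which points toward Dehn's invariant. In its standard form (supplied here for context, not quoted from the abstract), for a polyhedron P with edges e of length ℓ(e) and dihedral angle θ(e):

```latex
D_f(P) \;=\; \sum_{e \,\in\, \mathrm{edges}(P)} \ell(e)\, f(\theta(e))
```

Dehn showed that $D_f$ is invariant under scissors congruence for every additive $f$ with $f(2\pi) = 0$. Since the cube's dihedral angles are $\pi/2$ while the regular tetrahedron's angle $\arccos(1/3)$ is incommensurable with $\pi$, a suitable $f$ makes $D_f$ vanish on the cube but not on the tetrahedron, which settles Hilbert's question in the negative.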
Quantum Field Theory as Eigenvalue Problem
, 2003
Abstract

Cited by 1 (1 self)
A mathematically well-defined, manifestly covariant theory of classical and quantum fields is given, based on Euclidean Poisson algebras and a generalization of the Ehrenfest equation, which implies the stationary action principle. The theory opens a constructive spectral approach to finding physical states, both in relativistic quantum field theories and for flexible phenomenological few-particle approximations.
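For context, the ordinary (ungeneralized) Ehrenfest equation of quantum mechanics, for the expectation value of an observable A under a Hamiltonian H, is the standard relation:

```latex
\frac{d}{dt}\,\langle A \rangle
\;=\; \frac{1}{i\hbar}\,\bigl\langle [A, H] \bigr\rangle
\;+\; \Bigl\langle \frac{\partial A}{\partial t} \Bigr\rangle
```

The abstract indicates that it is a generalization of this relation, rather than a path-integral or canonical quantization, that serves as the dynamical starting point of the proposed field theory.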