Results 1–10 of 84
A new look at Newton’s inequalities
J. Inequal. Pure Appl. Math., 1(2) (2000), Article 17. [ONLINE: http://jipam.vu.edu.au/v1n2/014_99.html]
Cited by 30 (2 self)
Abstract: Communicated by A. Lupaş. New families of inequalities involving the elementary symmetric functions are built as a consequence of the fact that all zeros of certain real polynomials are real numbers.
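For orientation, the classical Newton inequalities that results of this kind extend can be stated as follows; this is a standard fact about elementary symmetric functions, added here for context rather than taken from the paper:

```latex
% For x_1, \dots, x_n \in \mathbb{R}, let e_k be the k-th elementary
% symmetric function and p_k = e_k / \binom{n}{k} the normalized means. Then
p_k^2 \;\ge\; p_{k-1}\, p_{k+1}, \qquad 1 \le k \le n-1 .
```

These follow from the real-rootedness of $\prod_i (t - x_i)$ and its derivatives, the same mechanism the abstract invokes; equality holds for every $k$ when all the $x_i$ are equal.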
Optimized product quantization for approximate nearest neighbor search
CVPR, 2013
Cited by 20 (6 self)
Abstract: Product quantization is an effective vector quantization approach to compactly encode high-dimensional vectors for fast approximate nearest neighbor (ANN) search. The essence of product quantization is to decompose the original high-dimensional space into the Cartesian product of a finite number of low-dimensional subspaces that are then quantized separately. Optimal space decomposition is important for the performance of ANN search, but still remains unaddressed. In this paper, we optimize product quantization by minimizing quantization distortions w.r.t. the space decomposition and the quantization codebooks. We present two novel methods for optimization: a nonparametric method that alternately solves two smaller subproblems, and a parametric method that is guaranteed to achieve the optimal solution if the input data follows a Gaussian distribution. We show by experiments that our optimized approach substantially improves the accuracy of product quantization for ANN search.
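To make the decompose-then-quantize pipeline concrete, here is a minimal sketch of plain (unoptimized) product quantization; all function names are illustrative, and the paper's actual contribution, optimizing the space decomposition itself, is deliberately not implemented:

```python
import numpy as np

def kmeans(data, k, iters=20, seed=0):
    """Plain Lloyd's k-means; returns k centroids for one subspace."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        dist = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dist.argmin(1)
        for j in range(k):
            pts = data[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def pq_train(X, M, k):
    """Split the D dims into M contiguous subspaces, one codebook each."""
    d = X.shape[1] // M
    return [kmeans(X[:, m * d:(m + 1) * d], k) for m in range(M)]

def pq_encode(X, codebooks):
    """Encode each vector as M codeword indices (nearest centroid per subspace)."""
    d = codebooks[0].shape[1]
    codes = []
    for m, C in enumerate(codebooks):
        sub = X[:, m * d:(m + 1) * d]
        dist = ((sub[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        codes.append(dist.argmin(1))
    return np.stack(codes, axis=1)

def pq_decode(codes, codebooks):
    """Reconstruct approximate vectors by concatenating the chosen codewords."""
    return np.hstack([codebooks[m][codes[:, m]] for m in range(len(codebooks))])
```

The optimized variant additionally learns a rotation of the input space (jointly with the codebooks) so that the fixed coordinate split above is replaced by a data-adaptive decomposition.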
A-Splines: Local Interpolation and Approximation using C^k-Continuous Piecewise Real Algebraic Curves
, 1992
Cited by 19 (15 self)
Abstract: We present a concise characterization of the Bernstein–Bézier (BB) form of an implicitly defined bivariate polynomial over a triangle, such that the zero contour of the polynomial defines a smooth and single-sheeted real algebraic curve segment. We call a piecewise C^k-continuous chain of such real algebraic curve segments in BB-form an A-spline (short for algebraic spline). We show how to construct cubic and quartic A-splines to locally interpolate and/or approximate the vertices of an arbitrary planar polygon with up to C^3 and C^5 continuity, respectively. Quadratic A-splines are always locally convex, and we prove that our cubic A-splines are also always locally convex. Additionally, we derive evaluation formulas for the efficient display of all these A-splines and present examples of their use in geometric modeling. Designing fonts with piecewise smooth curves or fitting curves to scattered data for image reconstruction are just two of the diverse applications…
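As a concrete illustration of the BB-form the abstract refers to, the following sketch evaluates a bivariate polynomial given by its BB coefficients over a triangle at barycentric coordinates. It is a generic evaluator under assumed conventions, not the paper's characterization or its display formulas:

```python
from math import factorial

def bb_eval(coeffs, u, v, w, n=3):
    """Evaluate a degree-n bivariate polynomial in Bernstein-Bezier form
    over a triangle at barycentric coordinates (u, v, w), u + v + w = 1.
    `coeffs` maps index triples (i, j, k) with i + j + k = n to b_ijk."""
    total = 0.0
    for (i, j, k), b in coeffs.items():
        # Trinomial Bernstein basis: n!/(i! j! k!) * u^i * v^j * w^k
        basis = factorial(n) // (factorial(i) * factorial(j) * factorial(k))
        total += b * basis * (u ** i) * (v ** j) * (w ** k)
    return total
```

The zero contour of such a polynomial over the triangle is the A-spline segment; the paper's result is a sign pattern on the coefficients b_ijk guaranteeing that this contour is smooth and single-sheeted.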
Who gave you the Cauchy–Weierstrass tale? The dual history of rigorous calculus
Foundations of Science
, 2012
A lambda calculus for real analysis
, 2005
Cited by 14 (0 self)
Abstract: Abstract Stone Duality (ASD) is a revolutionary theory that works directly with computable continuous functions, without using set theory, infinitary lattice theory or a prior theory of discrete computation. Every expression in the calculus denotes both a continuous function and a program, but the reasoning looks remarkably like a sanitised form of that in classical topology. This paper is an introduction to ASD for the general mathematician, and applies it to elementary real analysis. It culminates in the Intermediate Value Theorem, i.e. the solution of equations fx = 0 for continuous f: R → R. As is well known from both numerical and constructive considerations, the equation cannot be solved if f “hovers” near 0, whilst tangential solutions will never be found. In ASD, both of these failures and the general method of finding solutions of the equation when they exist are explained by the new concept of “overtness”. The zeroes are captured, not as a set, but by higher-type operators □ and ♦ that remain (Scott) continuous across singularities of a parametric equation. Expressing topology in terms of continuous functions rather than sets of points leads to…
Data Fitting with Cubic A-Splines
 Proceedings of Computer Graphics International, CGI'94
, 1996
Cited by 13 (7 self)
Abstract: We present algorithms for constructing isocontours from image data or fitting scattered point data with C^1, C^2 or C^3 piecewise smooth chains of single-sheeted real cubic algebraic curve segments called cubic A-splines (short for cubic algebraic splines). Using cubic A-splines we achieve data fitting with either a higher order of continuity, or greater local flexibility for fixed continuity, than numerous prior schemes. Key words: isocontours, scattered points, curve fitting, algebraic splines, cubic. Generating contours in image data, reconstructing digitized signals, and designing scalable fonts are only some of the several applications of spline curve fitting techniques. In this paper, we generalize past fitting schemes with conic splines [4, 16, 17, 18] and even rational parametric splines [7, 13, 19]. We exhibit efficient techniques for cubic algebraic splines (A-splines), achieving fits with a small number of pieces yet a higher order of smoothness/continuity…
Leibniz’s infinitesimals: Their fictionality, their modern implementations, and their foes from Berkeley to Russell and beyond
, 2012
Types in logic and mathematics before 1940
 Bulletin of Symbolic Logic
, 2002
Cited by 11 (5 self)
Abstract: In this article, we study the prehistory of type theory up to 1910 and its development between Russell and Whitehead’s Principia Mathematica ([71], 1910–1912) and Church’s simply typed λ-calculus of 1940. We first argue that the concept of types has always been present in mathematics, though nobody incorporated them explicitly as such before the end of the 19th century. Then we proceed by describing how the logical paradoxes entered the formal systems of Frege, Cantor and Peano, concentrating on Frege’s Grundgesetze der Arithmetik, to which Russell applied his famous paradox and which led him to introduce the first theory of types, the Ramified Type Theory (rtt). We present rtt formally using the modern notation for type theory and we discuss how Ramsey, Hilbert and Ackermann removed the orders from rtt, leading to the simple theory of types stt. We present stt and Church’s own simply typed λ-calculus (λ→C) and we finish by comparing rtt, stt and λ→C. Nowadays, type theory has many applications and is used in many different disciplines. Even within logic and mathematics, there are many different type systems; they serve several purposes and are formulated in various ways. But, before 1903, when Russell first introduced…
On order invariant synthesizing functions
 J. Math. Psychol.
, 2002
Cited by 11 (6 self)
Abstract: This paper gives a description of the class of continuous functions that are comparison meaningful in the sense of measurement theory. When idempotency is assumed, this class reduces to the Boolean max-min functions (lattice polynomials). In that case, continuity can be replaced by increasing monotonicity, provided the range of variables is open. The particular cases of order statistics and projection functions are also studied.
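A minimal illustration of a Boolean max-min function (lattice polynomial) and the comparison-meaningfulness it enjoys; the function and helper names are illustrative, and the increasing transform is an arbitrary assumed choice:

```python
def med3(x, y, z):
    # A lattice polynomial: the median of three values written
    # purely as a max-min combination of the inputs.
    return max(min(x, y), min(y, z), min(z, x))

def commutes_with(f, phi, samples):
    # Comparison meaningfulness (with a common scale): applying an
    # increasing transform phi to the inputs transforms the output
    # by the same phi, so order comparisons between outputs are preserved.
    return all(phi(f(*v)) == f(*map(phi, v)) for v in samples)

# Example: cubing is strictly increasing on the reals.
phi = lambda t: t ** 3
print(med3(2, -1, 5))                                   # the median
print(commutes_with(med3, phi, [(2, -1, 5), (0, 0, 1)]))
```

Since max and min commute with every increasing transformation, any lattice polynomial passes this check, which is the direction of the theorem that is easy to verify directly.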