Results 1–10 of 20
Stochastic Perturbation Theory, 1988
Abstract

Cited by 886 (35 self)
In this paper classical matrix perturbation theory is approached from a probabilistic point of view. The perturbed quantity is approximated by a first-order perturbation expansion, in which the perturbation is assumed to be random. This permits the computation of statistics estimating the variation in the perturbed quantity. Up to the higher-order terms that are ignored in the expansion, these statistics tend to be more realistic than perturbation bounds obtained in terms of norms. The technique is applied to a number of problems in matrix perturbation theory, including least squares and the eigenvalue problem.
Key words. perturbation theory, random matrix, linear system, least squares, eigenvalue, eigenvector, invariant subspace, singular value
AMS(MOS) subject classifications. 15A06, 15A12, 15A18, 15A52, 15A60
1. Introduction. Let A be a matrix and let F be a matrix-valued function of A. Two principal problems of matrix perturbation theory are the following. Given a matrix E, pr...
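The first-order idea the abstract describes can be sketched numerically (a minimal illustration under our own assumptions, not the paper's code: the matrix A, the noise level sigma, and the i.i.d.-Gaussian model for E below are all made up for the example). For a linear system Ax = b perturbed to (A + E)x~ = b, the expansion x~ ≈ x - A^{-1}Ex gives, when the entries of E are i.i.d. with standard deviation sigma, Var(x~_i) ≈ sigma^2 * ||x||^2 * sum_j (A^{-1})_{ij}^2, which a Monte Carlo run can check:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[4.0, 1.0], [1.0, 3.0]])   # illustrative matrix, not from the paper
b = np.array([1.0, 2.0])
x = np.linalg.solve(A, b)                # unperturbed solution

sigma = 1e-3        # std of each entry of the random perturbation E
n_trials = 20000

# Monte Carlo: solve the perturbed system (A + E) x~ = b many times
samples = np.empty((n_trials, b.size))
for k in range(n_trials):
    E = sigma * rng.standard_normal(A.shape)
    samples[k] = np.linalg.solve(A + E, b)
mc_std = samples.std(axis=0)

# First-order prediction: x~ - x ≈ -A^{-1} E x, so for i.i.d. entries of E
# Var(x~_i) ≈ sigma^2 * ||x||^2 * sum_j (A^{-1})_{ij}^2
Ainv = np.linalg.inv(A)
pred_std = sigma * np.linalg.norm(x) * np.sqrt((Ainv**2).sum(axis=1))

print(mc_std, pred_std)   # the two estimates should agree to a few percent
```

The agreement holds only while sigma is small enough for the ignored higher-order terms to be negligible, which is exactly the caveat the abstract states.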
Constrained optimization in seismic reflection tomography: an SQP augmented Lagrangian approach, in "Geophysical Journal International", 2005
Abstract

Cited by 9 (1 self)
Geophysical methods for imaging a complex geological subsurface in petroleum exploration require the determination of an accurate wave propagation velocity model. Seismic reflection tomography turns out to be an efficient method for doing this: it determines the
The Epic Story of Maximum Likelihood, 2008
Abstract

Cited by 9 (0 self)
At a superficial level, the idea of maximum likelihood must be prehistoric: early hunters and gatherers may not have used the words “method of maximum likelihood” to describe their choice of where and how to hunt and gather, but it is hard to believe they would have been surprised if their method had been described in those terms. It seems a simple, even unassailable idea: Who would rise to argue in favor of a method of minimum likelihood, or even mediocre likelihood? And yet the mathematical history of the topic shows this “simple idea” is really anything but simple. Joseph Louis Lagrange, Daniel Bernoulli, Leonhard Euler, Pierre Simon Laplace and Carl Friedrich Gauss are only some of those who explored the topic, not always in ways we would sanction today. In this article, that history is reviewed from well before Fisher to the time of Lucien Le Cam’s dissertation. In the process Fisher’s unpublished 1930 characterization of conditions for the consistency and efficiency of maximum likelihood estimates is presented, and the mathematical basis of his three proofs is discussed. In particular, Fisher’s derivation of the information inequality is seen to stem from his work on the analysis of variance, and his later approach via estimating functions was derived from Euler’s relation for homogeneous functions. The reaction to Fisher’s work is reviewed, and some lessons are drawn.
Geometric methods for state space identification. In Identification, Adaptation, Learning: The Science of Learning Models from Data, NATO ASI Series F, 1996
Abstract

Cited by 8 (3 self)
The scope of identification theory is to construct algorithms for automatic model building from observed data. In these lectures we shall only discuss the case where the data are collected in a single unrepeatable experiment and no preparation of the experiment is possible (i.e. we cannot choose the experimental
Polyhedral Approaches to Mixed Integer Linear Programming, 2008
Abstract

Cited by 7 (1 self)
This survey presents tools from polyhedral theory that are used in integer programming. It applies them to the study of valid inequalities for mixed integer linear sets, such as Gomory’s mixed integer cuts.
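As a pointer to what such valid inequalities look like, here is a minimal sketch of Gomory's fractional cut, the pure-integer special case of the cuts the survey studies (the tableau row, the numbers, and the function name below are made-up illustrations, not material from the survey): given a simplex-tableau row sum_j a_j x_j = b with all x_j integer and nonnegative and b fractional, the inequality sum_j frac(a_j) x_j >= frac(b) is valid for every integer solution but violated by the fractional LP optimum.

```python
from math import floor

def gomory_fractional_cut(coeffs, rhs):
    """Hypothetical helper: from a tableau row sum_j a_j x_j = b with
    integer variables and fractional b, build the Gomory fractional cut
    sum_j frac(a_j) x_j >= frac(b), where frac(v) = v - floor(v)."""
    frac = lambda v: v - floor(v)
    return [frac(a) for a in coeffs], frac(rhs)

# Illustrative row x1 + (7/3) x2 = 10/3 from an LP relaxation optimum:
cut_coeffs, cut_rhs = gomory_fractional_cut([1.0, 7 / 3], 10 / 3)
# cut: 0*x1 + (1/3) x2 >= 1/3, i.e. x2 >= 1
```

Adding such cuts tightens the LP relaxation toward the integer hull; the mixed integer version treated by Gomory (and in the survey) refines the coefficients for the continuous variables.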
Fisher and Regression
Abstract

Cited by 5 (3 self)
In 1922 R. A. Fisher introduced the modern regression model, synthesizing the regression theory of Pearson and Yule and the least squares theory of Gauss. The innovation was based on Fisher’s realization that the distribution associated with the regression coefficient was unaffected by the distribution of X. Subsequently Fisher interpreted the fixed X assumption in terms of his notion of ancillarity. This paper considers these developments against the background of the development of statistical theory in the early twentieth century.
CONTENTS From Homo Sapiens to the Renaissance The Renaissance
Abstract
Decision analysis emerged as a discipline separate from decision theory or operations research following World War II. It could not have emerged earlier. It required a stable society with the appropriate philosophy and culture, and a sufficiently rich mathematical language to think logically about decision making. It required the understanding of subjective probability of Ramsey (1926) and de Finetti (1931, 1937), and the appropriate measure of preference under uncertainty of von Neumann and Morgenstern (1947) and later of Savage (1954). It formally came into being with the naming by Howard and his
CHALLENGING THE LOGIC OF LEAST-SQUARES METHODS FOR GEODETIC ESTIMATION PROBLEMS
Abstract
Least-squares estimation methods are perhaps the most widely used tool in all fields of geodetic research. Nevertheless, their prevailing use is not often complemented by a widespread objective view of their rudiments. Within the standard formalism of least-squares estimation theory there are actually several paradoxical and curious issues which are seldom explicitly formulated. The aim of this expository paper is to present some of these issues and to discuss their implications for geodetic data analysis and parameter estimation problems. “Of all the principles that can be proposed for this purpose, I think there is none more general, more exact, or easier to apply than that which consists in minimizing the sum of the squares of the errors.” Adrien-Marie Legendre [1805]
The Quantitative Methods for Psychology
Abstract
 Add to MetaCart
GRD: An SPSS extension command for generating random data Bradley Harding , a, Denis Cousineau a