Results 1-10 of 108
Discrete Multivariate Analysis: Theory and Practice, 1975. Cited by 816 (47 self).
... the collaboration of Richard J. Light and Frederick Mosteller. ...
The Consistency of Posterior Distributions in Nonparametric Problems. Ann. Statist., 1996. Cited by 132 (4 self).
We give conditions that guarantee that the posterior probability of every Hellinger...
Asymptotic optimality of empirical likelihood for testing moment restrictions, 2001. Cited by 47 (5 self).
We show by example that empirical likelihood and other commonly used tests for moment restrictions are unable to control the (exponential) rate at which the probability of a Type I error tends to zero. It follows that the optimality of empirical likelihood asserted in Kitamura (2001) does not hold without additional assumptions. Under stronger assumptions than those in Kitamura (2001), we establish the following optimality result: (i) empirical likelihood controls the rate at which the probability of a Type I error tends to zero, and (ii) among all procedures for which the probability of a Type I error tends to zero at least as fast, empirical likelihood maximizes the rate at which the probability of a Type II error tends to zero for "most" alternatives. This result further implies that empirical likelihood maximizes the rate at which the probability of a Type II error tends to zero for all alternatives among a class of tests that satisfy a weaker criterion for their Type I error probabilities.
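For a concrete handle on the statistic these rate results concern, here is a minimal sketch (our own illustration, not code from the paper) of Owen-style empirical likelihood for the scalar moment restriction E[X] = mu0; the function name and bisection tolerances are assumptions.

```python
import math

def el_test_stat(xs, mu0):
    """Empirical likelihood ratio statistic -2*log R(mu0) for the scalar
    moment restriction E[X] = mu0 (Owen-style construction).
    Solves sum_i (x_i - mu0) / (1 + lam*(x_i - mu0)) = 0 for the Lagrange
    multiplier lam by bisection, then returns 2*sum_i log(1 + lam*(x_i - mu0))."""
    a = [x - mu0 for x in xs]
    if min(a) >= 0.0 or max(a) <= 0.0:
        return float("inf")  # mu0 outside the open convex hull of the data
    # Feasibility: every implied weight must stay positive, 1 + lam*a_i > 0.
    lo = -1.0 / max(a) + 1e-10
    hi = -1.0 / min(a) - 1e-10
    def g(lam):  # strictly decreasing in lam on (lo, hi)
        return sum(ai / (1.0 + lam * ai) for ai in a)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    return 2.0 * sum(math.log(1.0 + lam * ai) for ai in a)
```

Under the null the statistic is asymptotically chi-square with one degree of freedom; the competing exponential-rate criteria discussed above refine this first-order comparison.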
Empirical likelihood-based inference in conditional moment restriction models. Econometrica, 2004. Cited by 40 (2 self).
This paper proposes an asymptotically efficient method for estimating models with conditional moment restrictions. Our estimator generalizes the maximum empirical likelihood estimator (MELE) of Qin and Lawless (1994). Using a kernel smoothing method, we efficiently incorporate the information implied by the conditional moment restrictions into our empirical likelihood-based procedure. This yields a one-step estimator which avoids estimating optimal instruments. Our likelihood ratio-type statistic for parametric restrictions does not require the estimation of variance, and achieves asymptotic pivotalness implicitly. The estimation and testing procedures we propose are normalization invariant. Simulation results suggest that our new estimator works remarkably well in finite samples.
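The kernel-smoothing ingredient can be illustrated in a crude form. The sketch below is emphatically not the authors' MELE-type estimator, only a toy minimum-distance fit that turns the conditional moment E[y - theta*x | x] = 0 into Nadaraya-Watson-weighted local moments; the model, the Gaussian bandwidth, and the function name are all our own assumptions.

```python
import math

def smoothed_moment_fit(xs, ys, h=1.0):
    """Toy kernel-smoothing step for the conditional moment model
    E[y - theta*x | x] = 0. For each x_i, form the Nadaraya-Watson
    weighted local moment m_i(theta) = a_i - theta*b_i using Gaussian
    kernel weights, then pick theta minimizing sum_i m_i(theta)^2,
    which has a closed form for this linear moment function."""
    n = len(xs)
    a, b = [], []
    for i in range(n):
        w = [math.exp(-0.5 * ((xs[j] - xs[i]) / h) ** 2) for j in range(n)]
        s = sum(w)
        a.append(sum(wj * yj for wj, yj in zip(w, ys)) / s)  # local mean of y
        b.append(sum(wj * xj for wj, xj in zip(w, xs)) / s)  # local mean of x
    # argmin_theta sum_i (a_i - theta*b_i)^2  =>  theta = <a,b> / <b,b>
    return sum(ai * bi for ai, bi in zip(a, b)) / sum(bi * bi for bi in b)
```

The paper's point is that weighting the moment function this way lets an empirical-likelihood criterion absorb the conditional information in one step, without constructing optimal instruments explicitly.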
Conditional limit theorems under Markov conditioning. IEEE Trans. on Information Theory, 1987. Cited by 39 (1 self).
... variables taking values in a finite set X, and consider the conditional joint distribution of the first m elements of the sample X_1, ..., X_n, on the condition that X_1 = x_1 and the sliding block sample average of a function h(·,·) defined on X² exceeds a threshold α > Eh(X_1, X_2). For m fixed and n → ∞, this conditional joint distribution is shown to converge to the m-step joint distribution of a Markov chain started in x_1 which is closest to X_1, X_2, ... in Kullback-Leibler information divergence among all Markov chains whose two-dimensional stationary distribution P(·,·) satisfies Σ_{x,y} P(x,y) h(x,y) ≥ α, provided some distribution P on X² having equal marginals satisfies this constraint with strict inequality. Similar conditional limit theorems are obtained when X_1, X_2, ... is an arbitrary finite-order Markov chain and more general conditioning is allowed.
An information-spectrum approach to classical and quantum hypothesis testing for simple hypotheses. IEEE Trans. Inform. Theory, 2006. Cited by 36 (17 self).
The information-spectrum analysis made by Han for classical hypothesis testing for simple hypotheses is extended to a unifying framework including both classical and quantum hypothesis testing. The results are also applied to fixed-length source coding when loosening the normalizing condition for probability distributions and for quantum states. We establish general formulas for several quantities relating to the asymptotic optimality of tests/codes in terms of classical and quantum information spectra.
Asymptotic error rates in quantum hypothesis testing. Commun. Math. Phys., 2008. Cited by 36 (9 self).
We consider the problem of discriminating between two different states of a finite quantum system in the setting of large numbers of copies, and find a closed-form expression for the asymptotic exponential rate at which the specified error probability tends to zero. This leads to the identification of the quantum generalisation of the classical Chernoff distance, which is the corresponding quantity in classical symmetric hypothesis testing, thereby solving a long-standing open problem. The proof relies on a new trace inequality for pairs of positive operators as well as on a special mapping from pairs of density operators to pairs of probability distributions. These two new techniques have been introduced in [quant-ph/0610027] and [quant-ph/0607216], respectively. They are also well suited to prove the quantum generalisation of the Hoeffding bound, which is a modification of the Chernoff distance and specifies the optimal achievable asymptotic error rate in the context ...
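The classical quantity being generalised here is the Chernoff distance C(P, Q) = -min_{0<=s<=1} log Σ_x P(x)^s Q(x)^(1-s), the optimal error exponent in classical symmetric hypothesis testing. A small grid-search sketch (our own, assuming strictly positive finite distributions):

```python
import math

def chernoff_distance(p, q, grid=10000):
    """Classical Chernoff distance -min_{0<=s<=1} log sum_x p(x)^s q(x)^(1-s)
    for strictly positive finite distributions, found by grid search over s
    (the inner sum is convex in s, so a fine grid suffices)."""
    best = float("inf")
    for k in range(grid + 1):
        s = k / grid
        val = sum(pi ** s * qi ** (1 - s) for pi, qi in zip(p, q))
        best = min(best, val)
    return -math.log(best)
```

Under i.i.d. sampling, the minimal averaged error probability decays as exp(-n·C(P, Q)) to first order in the exponent, which is exactly the behaviour the quantum result mirrors.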
Empirical Likelihood Methods in Econometrics: Theory and Practice, 2006. Cited by 35 (3 self).
Recent developments in empirical likelihood (EL) methods are reviewed. First, to put the method in perspective, two interpretations of empirical likelihood are presented, one as a nonparametric maximum likelihood estimation method (NPMLE) and the other as a generalized minimum contrast estimator (GMC). The latter interpretation provides a clear connection between EL, GMM, GEL and other related estimators. Second, EL is shown to have various advantages over other methods. The theory of large deviations demonstrates that EL emerges naturally in achieving asymptotic optimality both for estimation and testing. Interestingly, higher order asymptotic analysis also suggests that EL is generally a preferred method. Third, extensions of EL are discussed in various settings, including estimation of conditional moment restriction models, nonparametric specification testing and time series models. Finally, practical issues in applying EL to real data, such as computational algorithms for EL, are discussed. Numerical examples to illustrate the efficacy of the method are presented.
When is the generalized likelihood ratio test optimal? IEEE Trans. Inform. Theory, 1992. Cited by 34 (4 self).
The generalized likelihood ratio test (GLRT), which is commonly used in composite hypothesis testing problems, is investigated. Conditions for asymptotic optimality of the GLRT in the Neyman-Pearson sense are studied and discussed. First, a general necessary and sufficient condition is established, and then, based on this, a sufficient condition, which is easier to verify, is derived. A counterexample, where the GLRT is not optimal, is provided as well. A conjecture is stated concerning the optimality of the GLRT for the class of finite-state sources. Index Terms: Hypothesis testing, generalized likelihood ratio test, maximum likelihood test, error exponent, Neyman-Pearson criterion, large deviations.
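As a toy instance of the GLRT (our own construction, not an example from the paper): for i.i.d. Bernoulli(theta) data and the simple null H0: theta = theta0, maximizing the likelihood over the alternative gives a statistic equal to 2n times the binary Kullback-Leibler divergence between the empirical frequency and theta0.

```python
import math

def glrt_bernoulli(k, n, theta0):
    """GLRT statistic 2*log(sup_theta L(theta) / L(theta0)) for n i.i.d.
    Bernoulli trials with k successes, testing H0: theta = theta0.
    The supremum is attained at phat = k/n, so the statistic reduces to
    2*n*D(phat || theta0), the binary KL divergence scaled by n."""
    phat = k / n
    def term(p, q):
        return 0.0 if p == 0.0 else p * math.log(p / q)
    return 2.0 * n * (term(phat, theta0) + term(1.0 - phat, 1.0 - theta0))
```

The KL form is no accident: n·D(phat || theta0) is the standard large-deviations error exponent, which is why the optimality questions above are phrased in terms of error exponents in the Neyman-Pearson sense.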
Universal Composite Hypothesis Testing: A Competitive Minimax Approach, 2001. Cited by 30 (10 self).
A novel approach is presented for the long-standing problem of composite hypothesis testing. In composite hypothesis testing, unlike in simple hypothesis testing, the probability function of the observed data given the hypothesis is uncertain, as it depends on the unknown value of some parameter. The proposed approach is to minimize the worst-case ratio between the probability of error of a decision rule that is independent of the unknown parameters and the minimum probability of error attainable given the parameters. The principal solution to this minimax problem is presented and the resulting decision rule is discussed. Since the exact solution is, in general, hard to find, and a fortiori hard to implement, an approximation method that yields an asymptotically minimax decision rule is proposed. Finally, a variety of potential application areas are provided in signal processing and communications, with special emphasis on universal decoding.
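The worst-case-ratio criterion can be made concrete with a brute-force toy (entirely our own construction, with arbitrary numbers): H0 is a fair coin, the alternative is a coin with unknown bias theta in {0.3, 0.7}, n = 5 tosses, equal priors. We enumerate every deterministic decision region and pick the one minimizing the worst-case ratio of its error probability to that of the theta-aware optimal rule.

```python
from itertools import combinations
from math import comb

def binom_pmf(n, p, k):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

def competitive_minimax_rule(n=5, thetas=(0.3, 0.7), theta0=0.5):
    """Toy competitive-minimax test: choose the region of success counts
    that decides H1 so as to minimize max over theta of
    Pe(rule, theta) / Pe_opt(theta), with equal priors on H0/H1."""
    counts = range(n + 1)
    def pe(region_h1, theta):
        # error = 0.5 * P0(decide H1) + 0.5 * P_theta(decide H0)
        e0 = sum(binom_pmf(n, theta0, k) for k in region_h1)
        e1 = sum(binom_pmf(n, theta, k) for k in counts if k not in region_h1)
        return 0.5 * e0 + 0.5 * e1
    def pe_opt(theta):
        # oracle benchmark: the MAP (likelihood-ratio) rule when theta is known
        region = {k for k in counts
                  if binom_pmf(n, theta, k) > binom_pmf(n, theta0, k)}
        return pe(region, theta)
    best_rule, best_ratio = None, float("inf")
    for r in range(n + 2):
        for region in combinations(counts, r):
            region = set(region)
            ratio = max(pe(region, th) / pe_opt(th) for th in thetas)
            if ratio < best_ratio:
                best_rule, best_ratio = region, ratio
    return best_rule, best_ratio
```

The ratio is always at least 1, and it exceeds 1 strictly here because no single rule is simultaneously optimal against both biases; the paper's contribution is characterizing and approximating this minimax solution when exhaustive search is infeasible.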