Results 1–10 of 23,531
The Dantzig selector: statistical estimation when p is much larger than n
, 2005
Abstract: In many important statistical applications, the number of variables or parameters p is much larger than the number of observations n. Suppose then that we have observations y = Ax + z, where x ∈ R^p is a parameter vector of interest, A is a data matrix with possibly far fewer rows than columns, n ≪ ...
Cited by 879 (14 self)
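The linear model in the abstract, y = Ax + z with far more parameters than observations, can be set up in a few lines; the dimensions, sparsity level, and noise scale below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

n, p = 50, 200                      # far fewer observations than parameters (n << p)
A = rng.standard_normal((n, p))     # data matrix with more columns than rows

x = np.zeros(p)                     # sparse parameter vector of interest
x[:5] = rng.standard_normal(5)      # only a handful of nonzero entries

z = 0.1 * rng.standard_normal(n)    # stochastic noise
y = A @ x + z                       # observations y = Ax + z

print(A.shape, y.shape)
```

Recovering x from (y, A) in this regime is what the Dantzig selector addresses; the snippet only constructs the observation model.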
Critical values for cointegration tests
 Eds.), Long-Run Economic Relationship: Readings in Cointegration
, 1991
Abstract: This paper provides tables of critical values for some popular tests of cointegration and unit roots. Although these tables are necessarily based on computer simulations, they are much more accurate than those previously available. The results of the simulation experiments are summarized by means of ...
Cited by 506 (3 self)
Choices, values and frames
 American Psychologist
, 1984
Abstract: Making decisions is like speaking prose: people do it all the time, knowingly or unknowingly. It is hardly surprising, then, that the topic of decision making is shared by many disciplines, from mathematics and statistics, through economics and political science, to sociology and psychology. The stu ... The tension between normative and descriptive considerations characterizes much of the study of judgment and choice. Analyses of decision making commonly distinguish risky and riskless choices. The paradigmatic example of decision under risk is the acceptability of a gamble that yields monetary outcomes ...
Cited by 684 (9 self)
A direct approach to false discovery rates
, 2002
Abstract: Summary. Multiple-hypothesis testing involves guarding against much more complicated errors than single-hypothesis testing. Whereas we typically control the type I error rate for a single-hypothesis test, a compound error rate is controlled for multiple-hypothesis tests. For example, controlling the ...
Cited by 775 (14 self)
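As a concrete instance of compound error control, here is a minimal sketch of the classical Benjamini-Hochberg step-up procedure, a standard FDR-controlling method; the paper's "direct approach" (q-values) is a related but distinct technique, and the p-values below are made up for illustration:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Indices of hypotheses rejected at FDR level alpha
    by the Benjamini-Hochberg step-up procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0                                    # largest rank with p_(k) <= (k/m) * alpha
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank / m * alpha:
            k = rank
    return sorted(order[:k])                 # reject all hypotheses up to rank k

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5, 0.8, 0.9]
print(benjamini_hochberg(pvals))
```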
Bayes Factors
, 1995
Abstract: In a 1935 paper, and in his book Theory of Probability, Jeffreys developed a methodology for quantifying the evidence in favor of a scientific theory. The centerpiece was a number, now called the Bayes factor, which is the posterior odds of the null hypothesis when the prior probability on the null is one-half. Although there has been much discussion of Bayesian hypothesis testing in the context of criticism of P values, less attention has been given to the Bayes factor as a practical tool of applied statistics. In this paper we review and discuss the uses of Bayes factors in the context of five ...
Cited by 1826 (74 self)
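For two simple hypotheses the Bayes factor reduces to a likelihood ratio, and with prior probability one-half on the null (prior odds of 1) the posterior odds equal the Bayes factor. A toy sketch; the coin-flip setup and the specific alternative θ = 0.7 are illustrative assumptions, not from the paper:

```python
from math import comb

def bayes_factor(heads, n, theta0=0.5, theta1=0.7):
    """Bayes factor for H1: P(heads) = theta1 against H0: P(heads) = theta0,
    given `heads` successes in n binomial trials."""
    lik0 = comb(n, heads) * theta0**heads * (1 - theta0)**(n - heads)
    lik1 = comb(n, heads) * theta1**heads * (1 - theta1)**(n - heads)
    return lik1 / lik0      # posterior odds = Bayes factor x prior odds

print(round(bayes_factor(7, 10), 3))    # evidence mildly favours H1
```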
Valgrind: A framework for heavyweight dynamic binary instrumentation
 In Proceedings of the 2007 Programming Language Design and Implementation Conference
, 2007
Abstract: Dynamic binary instrumentation (DBI) frameworks make it easy to build dynamic binary analysis (DBA) tools such as checkers and profilers. Much of the focus on DBI frameworks has been on performance; little attention has been paid to their capabilities. As a result, we believe the potential of DBI has ...
Cited by 558 (5 self)
Data Security
, 1979
Abstract: The rising abuse of computers and increasing threat to personal privacy through data banks have stimulated much interest in the technical safeguards for data. There are four kinds of safeguards, each related to but distinct from the others. Access controls regulate which users may enter the system and ...
Cited by 615 (3 self)
Inducing Features of Random Fields
 IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE
, 1997
Abstract: We present a technique for constructing random fields from a set of training samples. The learning paradigm builds increasingly complex fields by allowing potential functions, or features, that are supported by increasingly large subgraphs. Each feature has a weight that is trained by minimizing the Kullback-Leibler divergence between the model and the empirical distribution of the training data. A greedy algorithm determines how features are incrementally added to the field and an iterative scaling algorithm is used to estimate the optimal values of the weights. The random field models and techniques ...
Cited by 670 (10 self)
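The iterative scaling step mentioned in the abstract can be illustrated on a toy maximum-entropy model. This is a sketch of generalized iterative scaling (GIS), not the paper's improved iterative scaling or its feature-induction step; the four-element domain, the features, and the empirical distribution are invented for illustration:

```python
import math

def gis(feats, emp, iters=500):
    """Generalized iterative scaling for a maxent model p(x) ~ exp(sum_j lam_j f_j(x))
    over a small discrete domain. feats[x] is the feature vector of x, emp the
    empirical distribution. Feature counts must sum to the same constant C for
    every x (add a slack feature if needed)."""
    n_feat = len(feats[0])
    C = sum(feats[0])                      # constant total feature count per x
    lam = [0.0] * n_feat
    emp_exp = [sum(emp[x] * feats[x][j] for x in range(len(feats)))
               for j in range(n_feat)]     # empirical feature expectations
    for _ in range(iters):
        w = [math.exp(sum(l * f for l, f in zip(lam, feats[x])))
             for x in range(len(feats))]
        Z = sum(w)
        p = [wi / Z for wi in w]           # current model distribution
        mod_exp = [sum(p[x] * feats[x][j] for x in range(len(feats)))
                   for j in range(n_feat)] # model feature expectations
        lam = [l + math.log(e / m) / C     # GIS multiplicative update
               for l, e, m in zip(lam, emp_exp, mod_exp)]
    return p

# domain {0,1,2,3}; features: [x >= 2, x odd, slack so counts sum to C = 2]
feats = [[0, 0, 2], [0, 1, 1], [1, 0, 1], [1, 1, 0]]
emp = [0.1, 0.2, 0.3, 0.4]
p = gis(feats, emp)
print([round(q, 3) for q in p])
```

The slack feature gives every x the same total feature count C, which the (1/C) step size in GIS requires; at convergence the model's feature expectations match the empirical ones.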
On the Use of Windows for Harmonic Analysis With the Discrete Fourier Transform
 Proc. IEEE
, 1978
Abstract: This paper makes available a concise review of data windows and their effect on the detection of harmonic signals in the presence of broad-band noise and of nearby strong harmonic interference. The compromise discussed consists of applying windows to the sampled data set or, equivalently, smoothing the spectral samples. ... There is much signal processing devoted to detection and estimation. Detection is the task of determining if a specific signal set is present in an observation, while estimation is the task of obtaining the values of the parameters ...
Cited by 668 (0 self)
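The windowing operation described in the abstract is a one-liner in practice: multiply the sampled data by a window before the DFT, trading main-lobe width for much lower sidelobes. A small sketch; the sampling rate, tone frequency, and choice of a Hann window are illustrative assumptions:

```python
import numpy as np

fs, N = 1000, 1024                    # sampling rate (Hz) and record length
t = np.arange(N) / fs
x = np.sin(2 * np.pi * 123.4 * t)     # tone between DFT bins -> spectral leakage

rect = np.abs(np.fft.rfft(x))                  # no window (rectangular)
hann = np.abs(np.fft.rfft(x * np.hanning(N)))  # Hann window applied first

# Far from the tone, the windowed spectrum shows much less leakage.
print(rect[400] > hann[400])
```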
Clustering by passing messages between data points
 Science
, 2007
Abstract: Clustering data by identifying a subset of representative examples is important for processing sensory signals and detecting patterns in data. Such “exemplars” can be found by randomly choosing an initial subset of data points and then iteratively refining it, but this works well only if that initial choice is close to a good solution. We devised a method called “affinity propagation,” which takes as input measures of similarity between pairs of data points. Real-valued messages are exchanged between data points until a high-quality set of exemplars and corresponding clusters gradually emerges ...
Cited by 696 (8 self)
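The message passing has a compact form: points exchange "responsibilities" and "availabilities" until exemplars emerge. A minimal sketch of the published update rules with damping; the 1-D data, the negative-squared-distance similarities, and the median preference are conventional illustrative choices, not from the paper:

```python
import numpy as np

def affinity_propagation(S, damping=0.5, iters=200):
    """Minimal affinity propagation sketch. S is an n x n similarity matrix;
    S[k, k] (the "preferences") controls how many exemplars emerge.
    Returns each point's chosen exemplar index."""
    n = S.shape[0]
    R = np.zeros((n, n))    # responsibilities r(i, k)
    A = np.zeros((n, n))    # availabilities a(i, k)
    for _ in range(iters):
        # r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        idx = np.argmax(AS, axis=1)
        first_max = AS[np.arange(n), idx]
        AS[np.arange(n), idx] = -np.inf
        second_max = AS.max(axis=1)
        R_new = S - first_max[:, None]
        R_new[np.arange(n), idx] = S[np.arange(n), idx] - second_max
        R = damping * R + (1 - damping) * R_new
        # a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        A_new = Rp.sum(axis=0)[None, :] - Rp
        dA = A_new.diagonal().copy()       # a(k,k) = sum_{i' != k} max(0, r(i',k))
        A_new = np.minimum(A_new, 0)
        np.fill_diagonal(A_new, dA)
        A = damping * A + (1 - damping) * A_new
    return np.argmax(A + R, axis=1)        # exemplar assignment per point

# two obvious 1-D clusters; similarities are negative squared distances
x = np.array([0.0, 0.1, 0.2, 5.0, 5.1, 5.2])
S = -(x[:, None] - x[None, :]) ** 2
np.fill_diagonal(S, np.median(S))          # common default preference
labels = affinity_propagation(S)
print(labels)
```

The points should settle into two clusters, each naming one of its own members as exemplar.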