Results 1–10 of 180
Some Aspects of the Sequential Design of Experiments
 Bulletin of the American Mathematical Society
"... to the design and analysis of sampling experiments in which the size and composition of the samples are completely determined before the experimentation begins. The reasons for this are partly historical, dating back to the time when the statistician was consulted, ..."
Abstract

Cited by 267 (0 self)
to the design and analysis of sampling experiments in which the size and composition of the samples are completely determined before the experimentation begins. The reasons for this are partly historical, dating back to the time when the statistician was consulted, ...
Prior Probabilities
 IEEE Transactions on Systems Science and Cybernetics
, 1968
"... e case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probability of success. In realistic problems, both the transformation group analysis and the principle of maximum entropy are needed to determine the prior. The distributions thus found are uniquely determ ..."
Abstract

Cited by 165 (3 self)
... the case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probability of success. In realistic problems, both the transformation group analysis and the principle of maximum entropy are needed to determine the prior. The distributions thus found are uniquely determined by the prior information, independently of the choice of parameters. In a certain class of problems, therefore, the prior distributions may now be claimed to be fully as "objective" as the sampling distributions. I. Background of the problem. Since the time of Laplace, applications of probability theory have been hampered by difficulties in the treatment of prior information. In realistic problems of decision or inference, we often have prior information which is highly relevant to the question being asked; to fail to take it into account is to commit the most obvious inconsistency of reasoning and may lead to absurd or dangerously misleading results. As an extreme examp...
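The maximum-entropy step described in this abstract can be illustrated with a small numerical sketch (not Jaynes's own derivation): on a finite support with a fixed mean, the maximum-entropy distribution has exponential form p_i ∝ exp(λx_i), and λ can be found numerically. The dice support and the mean of 4.5 below are illustrative values.

```python
import math

def maxent_dist(values, mean, tol=1e-10):
    """Maximum-entropy distribution on a finite support subject to a
    mean constraint: p_i proportional to exp(lam * x_i), with lam found
    by bisection so that E[x] = mean holds (the moment is monotone in lam)."""
    def moment(lam):
        w = [math.exp(lam * x) for x in values]
        z = sum(w)
        return sum(x * wi for x, wi in zip(values, w)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if moment(mid) < mean:
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2
    w = [math.exp(lam * x) for x in values]
    z = sum(w)
    return [wi / z for wi in w]

# Illustrative constraint: die faces 1..6 with average 4.5 (> 3.5, so the
# maxent distribution tilts toward the high faces).
p = maxent_dist([1, 2, 3, 4, 5, 6], 4.5)
```

Because the constrained mean exceeds the uniform mean of 3.5, λ is positive and the probabilities increase with the face value.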
Scalable Feature Selection, Classification and Signature Generation for Organizing Large Text Databases Into Hierarchical Topic Taxonomies
, 1998
"... We explore how to organize large text databases hierarchically by topic to aid better searching, browsing and filtering. Many corpora, such as internet directories, digital libraries, and patent databases are manually organized into topic hierarchies, also called taxonomies. Similar to indices for r ..."
Abstract

Cited by 105 (7 self)
We explore how to organize large text databases hierarchically by topic to aid better searching, browsing and filtering. Many corpora, such as internet directories, digital libraries, and patent databases are manually organized into topic hierarchies, also called taxonomies. Similar to indices for relational data, taxonomies make search and access more efficient. However, the exponential growth in the volume of online textual information makes it nearly impossible to maintain such taxonomic organization for large, fast-changing corpora by hand. We describe an automatic system that starts with a small sample of the corpus in which topics have been assigned by hand, and then updates the database with new documents as the corpus grows, assigning topics to these new documents with high speed and accuracy. To do this, we use techniques from statistical pattern recognition to efficiently separate the feature words, or...
Possibility Theory as a Basis for Qualitative Decision Theory
, 1995
"... A counterpart to von Neumann and Morgenstern' expected utility theory is proposed in the framework of possibility theory. The existence of a utility function, representing a preference ordering among possibility distributions (on the consequences of decisionmaker's actions) that satisfies a series ..."
Abstract

Cited by 99 (25 self)
A counterpart to von Neumann and Morgenstern's expected utility theory is proposed in the framework of possibility theory. The existence of a utility function, representing a preference ordering among possibility distributions (on the consequences of the decision-maker's actions) that satisfies a series of axioms pertaining to the decision-maker's behavior, is established. The obtained utility is a generalization of Wald's criterion, which is recovered in the case of total ignorance; when ignorance is only partial, the utility takes into account the fact that some situations are more plausible than others. Mathematically, the qualitative utility is nothing but the necessity measure of a fuzzy event in the sense of possibility theory (a so-called Sugeno integral). The possibilistic representation of uncertainty, which only requires a linearly ordered scale, is qualitative in nature. Only max, min and order-reversing operations are used on the scale. The axioms express a risk-averse behavior of the d...
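A minimal sketch of the pessimistic qualitative utility this abstract describes, assuming a finite ordered scale {0, ..., L} with n(x) = L - x as the order-reversing map; the states, possibility degrees, and consequence utilities below are hypothetical.

```python
def pessimistic_utility(pi, u, act, L=4):
    """Qualitative pessimistic utility of an act on a finite scale 0..L:
    min over states s of max(n(pi(s)), u(act(s))), where n(x) = L - x.
    Under total ignorance (pi(s) = L for all s), n(pi(s)) = 0 and this
    reduces to Wald's criterion min_s u(act(s))."""
    return min(max(L - pi[s], u[act[s]]) for s in pi)

# Hypothetical example: two states, degrees on the scale 0..4.
pi = {"s1": 4, "s2": 1}           # s1 fully plausible, s2 barely plausible
u = {"good": 4, "bad": 0}         # utilities of consequences
act = {"s1": "good", "s2": "bad"} # act maps states to consequences

value = pessimistic_utility(pi, u, act)           # bad outcome discounted
wald = pessimistic_utility({"s1": 4, "s2": 4}, u, act)  # total ignorance
```

Here the implausible bad state only drags the utility down to 3 rather than 0; with total ignorance the Wald worst-case value 0 is recovered, matching the abstract's claim.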
Using taxonomy, discriminants, and signatures for navigating in text databases
 In Proceedings of the 23rd VLDB Conference
, 1997
"... We explore how to organize a text database hierarchically to aid better searching and browsing. We propose to exploit the natural hierarchy of topics, or taxonomy, that many corpora,suchas internet directories, digital libraries, and patent databases enjoy. In our system, the user navigates through ..."
Abstract

Cited by 76 (5 self)
We explore how to organize a text database hierarchically to aid better searching and browsing. We propose to exploit the natural hierarchy of topics, or taxonomy, that many corpora, such as internet directories, digital libraries, and patent databases enjoy. In our system, the user navigates through the query response not as a flat unstructured list, but embedded in the familiar taxonomy, and annotated with document signatures computed dynamically with respect to where the user is located at any time. We show how to update such databases with new documents with high speed and accuracy. We use techniques from statistical pattern recognition to efficiently separate the feature words or discriminants from the noise words at each node of the taxonomy. Using these, we build a multilevel classifier. At each node, this classifier can ignore the large number of noise words in a document. Thus the classifier has a small model size and is very fast. However, owing to the use of context-sensitive features, the classifier is very accurate. We report on experiences with the Reuters newswire benchmark, the US Patent database, and web document samples from Yahoo!.
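The per-node feature-selection-plus-classification idea can be sketched roughly as follows. This is a toy stand-in, not the paper's exact method: the discrimination score here is simply the spread of relative word frequencies across child classes, and the corpus is invented.

```python
import math
from collections import Counter

def train_node(docs_by_class, k_features=4):
    # Word counts and totals per child class at this taxonomy node.
    counts = {c: Counter(w for d in docs for w in d.split())
              for c, docs in docs_by_class.items()}
    totals = {c: sum(cnt.values()) for c, cnt in counts.items()}
    vocab = set().union(*counts.values())
    classes = list(docs_by_class)

    # "Discriminants": words whose relative frequency differs most across
    # classes (a crude stand-in for the paper's feature-selection index).
    def spread(w):
        freqs = [counts[c][w] / totals[c] for c in classes]
        return max(freqs) - min(freqs)

    features = sorted(vocab, key=spread, reverse=True)[:k_features]

    # Multinomial naive Bayes with Laplace smoothing on the small vocabulary.
    model = {c: {w: (counts[c][w] + 1) / (totals[c] + len(features))
                 for w in features} for c in classes}
    return features, model

def classify(doc, features, model):
    # Noise words (anything outside the selected features) are ignored,
    # which is what keeps the per-node model small and fast.
    scores = {c: sum(math.log(p[w]) for w in doc.split() if w in features)
              for c, p in model.items()}
    return max(scores, key=scores.get)

docs = {"physics": ["quantum energy field", "energy quantum state"],
        "biology": ["cell protein gene", "gene protein cell membrane"]}
feats, model = train_node(docs)
label = classify("quantum field theory", feats, model)
```

In a hierarchy, one such small classifier would sit at each internal node, routing a document down to the child whose score is highest.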
Robust portfolio rules and asset pricing
, 1999
"... Parameter uncertainty or, more broadly, model uncertainty seems highly relevant in many aspects of financial decisionmaking. I explore the effects of such uncertainty on dynamic portfolio and consumption decisions, and on equilibrium asset prices. In particular, I use the framework of Anderson, Ha ..."
Abstract

Cited by 46 (0 self)
Parameter uncertainty or, more broadly, model uncertainty seems highly relevant in many aspects of financial decision-making. I explore the effects of such uncertainty on dynamic portfolio and consumption decisions, and on equilibrium asset prices. In particular, I use the framework of Anderson, Hansen and Sargent (1999), which attributes a preference for robustness to the decision-maker. Worried that the model she uses is misspecified, a robust agent seeks decision rules that insure against some worst-case misspecification, in accordance with max-min expected utility. I demonstrate that robustness dramatically decreases the portfolio demand for equities. When modifying the framework of Anderson, Hansen and Sargent to impose homotheticity, I find robustness to be observationally equivalent to recursive preferences: robustness increases risk aversion, without affecting the willingness to substitute intertemporally. When investment opportunity sets are time-varying, robustness leads to an additional hedging-type asset demand, even for logarithmic utility. In an equilibrium exchange economy, robustness increases the equilibrium equity premium. The endogenous worst-case scenario for equity returns supporting the equilibrium is shown to be the equilibrium return generated by a model without robustness. Because of this, matching both the equity premium and the risk-free rate is challenging.
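A toy max-min portfolio computation in the spirit of this abstract: a mean-variance objective with an interval of candidate equity premia as a crude stand-in for the misspecification sets in Anderson, Hansen and Sargent. All numbers are illustrative, not taken from the paper.

```python
def robust_weight(mu, sigma2, gamma, delta, grid=201):
    """Max-min sketch: choose the equity weight w in [0, 1] maximizing
    the worst case of the mean-variance objective
        w*m - 0.5*gamma*w^2*sigma2
    over candidate premia m in {mu - delta, mu + delta}."""
    best_w, best_val = 0.0, float("-inf")
    for i in range(grid):
        w = i / (grid - 1)
        worst = min(w * m - 0.5 * gamma * w * w * sigma2
                    for m in (mu - delta, mu + delta))
        if worst > best_val:
            best_w, best_val = w, worst
    return best_w

# Illustrative: premium 6%, variance 0.04, risk aversion 2.
w_plain = robust_weight(0.06, 0.04, 2.0, 0.00)   # no model uncertainty
w_robust = robust_weight(0.06, 0.04, 2.0, 0.03)  # +/- 3% misspecification
```

Without uncertainty the optimum is the usual mu/(gamma*sigma2) = 0.75; with the worst-case premium of 3%, the demand for equities falls sharply to 0.375, illustrating the abstract's main qualitative claim.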
The Asymptotic Efficiency Of Simulation Estimators
 Operations Research
, 1992
"... A decisiontheoretic framework is proposed for evaluating the efficiency of simulation estimators. The framework includes the cost of obtaining the estimate as well as the cost of acting based on the estimate. The cost of obtaining the estimate and the estimate itself are represented as realizations ..."
Abstract

Cited by 43 (14 self)
A decision-theoretic framework is proposed for evaluating the efficiency of simulation estimators. The framework includes the cost of obtaining the estimate as well as the cost of acting based on the estimate. The cost of obtaining the estimate and the estimate itself are represented as realizations of jointly distributed stochastic processes. In this context, the efficiency of a simulation estimator based on a given computational budget is defined as the reciprocal of the risk (the overall expected cost). This framework is appealing philosophically, but it is often difficult to apply in practice (e.g., to compare the efficiency of two different estimators) because only rarely can the efficiency associated with a given computational budget be calculated. However, a useful practical framework emerges in a large sample context when we consider the limiting behavior as the computational budget increases. A limit theorem established for this model supports and extends a fairly well known e...
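The large-budget notion of efficiency sketched above is often operationalized as work-normalized variance: per-replication variance times per-replication cost, with the smaller product the more efficient estimator. A hedged illustration comparing a crude and an antithetic estimator of E[exp(U)], U ~ Uniform(0,1); the cost weights (1 vs. 2 function evaluations) are assumed for the example.

```python
import math
import random

def work_normalized_variance(samples, cost_per_sample):
    """Efficiency proxy in the large-budget limit: (variance of one
    replication) x (cost of one replication); smaller is better."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    return var * cost_per_sample

random.seed(1)
us = [random.random() for _ in range(20000)]
# Two unbiased estimators of E[exp(U)] for U ~ Uniform(0, 1):
crude = [math.exp(u) for u in us]                         # 1 evaluation each
anti = [(math.exp(u) + math.exp(1 - u)) / 2 for u in us]  # 2 evaluations each

eff_crude = work_normalized_variance(crude, 1)
eff_anti = work_normalized_variance(anti, 2)
```

The antithetic estimator costs twice as much per replication but reduces the variance by far more than half, so its work-normalized variance is smaller and it is the more efficient estimator in this framework.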
Policy Evaluation in Uncertain Economic Environments (with discussion)
 Brookings Papers on Economic Activity
, 2003
"... It will be remembered that the seventy translators of the Septuagint were shut up in seventy separate rooms with the Hebrew text and brought out with them, when they emerged, seventy identical translations. Would the same miracle be vouchsafed if seventy multiple correlators were shut up with the sa ..."
Abstract

Cited by 31 (5 self)
It will be remembered that the seventy translators of the Septuagint were shut up in seventy separate rooms with the Hebrew text and brought out with them, when they emerged, seventy identical translations. Would the same miracle be vouchsafed if seventy multiple correlators were shut up with the same statistical material? And anyhow, I suppose, if each had a different economist perched on his a priori, that would make a difference to the outcome. [1] This paper describes some approaches to macroeconomic policy evaluation in the presence of uncertainty about the structure of the economic environment under study. The perspective we discuss is designed to facilitate policy evaluation for several forms of uncertainty. For example, our approach may be used when an analyst is unsure about the appropriate economic theory that should be assumed to apply, or about the particular functional forms that translate a general theory into a form amenable to statistical analysis. As such, the methods we describe are, we believe, particularly useful in a range of macroeconomic contexts where fundamental disagreements exist as to the determinants of the problem under study. In addition, this approach recognizes that even if economists agree on the ...
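One simple formal device behind policy evaluation under theory uncertainty of this kind is model averaging: weight each candidate model's predicted policy effect by a posterior probability, with total uncertainty combining within-model variance and between-model disagreement. A hypothetical two-model sketch (the probabilities, effects, and variances are invented for illustration):

```python
def model_average(models):
    """Model-averaged policy effect and its variance. Each entry is
    (posterior probability, point estimate, within-model variance).
    Total variance = sum_i p_i * (v_i + (e_i - mean)^2), i.e. averaged
    within-model variance plus between-model disagreement."""
    mean = sum(p * e for p, e, _ in models)
    var = sum(p * (v + (e - mean) ** 2) for p, e, v in models)
    return mean, var

# Hypothetical: two competing theories of a policy's effect,
# with posterior weights 0.6 and 0.4.
avg_effect, avg_var = model_average([(0.6, 1.0, 0.2), (0.4, -0.5, 0.3)])
```

Note that the averaged variance (0.78 here) exceeds both within-model variances: disagreement between the theories is itself a source of uncertainty, which is the point the passage makes about fundamental disagreements among economists.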
Qualitative decision theory: from Savage’s axioms to nonmonotonic reasoning
 Journal of the ACM
, 2002
"... Abstract: This paper investigates to what extent a purely symbolic approach to decision making under uncertainty is possible, in the scope of Artificial Intelligence. Contrary to classical approaches to decision theory, we try to rank acts without resorting to any numerical representation of utility ..."
Abstract

Cited by 31 (0 self)
This paper investigates to what extent a purely symbolic approach to decision making under uncertainty is possible, in the scope of Artificial Intelligence. Contrary to classical approaches to decision theory, we try to rank acts without resorting to any numerical representation of utility or uncertainty, and without using any scale on which both uncertainty and preference could be mapped. Our approach is a variant of Savage's where the setting is finite, and the strict preference on acts is a partial order. It is shown that although many axioms of Savage's theory are preserved and despite the intuitive appeal of the ordinal method for constructing a preference over acts, the approach is inconsistent with a probabilistic representation of uncertainty. The latter leads to the kind of paradoxes encountered in the theory of voting. It is shown that the assumption of ordinal invariance enforces a qualitative decision procedure that presupposes a comparative possibility representation of uncertainty, originally due to Lewis, and usual in nonmonotonic reasoning. Our axiomatic investigation thus provides decision-theoretic foundations to the preferential inference of Lehmann and colleagues. However, the obtained decision rules are sometimes either not very decisive or may lead to overconfident decisions, although their basic principles look sound. This paper points out some limitations of purely ordinal approaches to Savage-like decision making under uncertainty, in perfect analogy with similar difficulties in voting theory.
Centroid estimation in discrete high-dimensional spaces with applications in biology
 Proceedings of the National Academy of Sciences
, 2008
"... in biology ..."