Results 1–10 of 168
Anomalies: Risk Aversion
JOURNAL OF ECONOMIC PERSPECTIVES, 2001
Abstract

Cited by 76 (2 self)
Economics can be distinguished from other social sciences by the belief that most (all?) behavior can be explained by assuming that rational agents with stable, well-defined preferences interact in markets that (eventually) clear. An empirical result qualifies as an anomaly if it is difficult to "rationalize" or if implausible assumptions are necessary to explain it within the paradigm. Suggestions for future topics should be sent to Richard Thaler, c/o Journal of Economic
Soft Computing: the Convergence of Emerging Reasoning Technologies
Soft Computing, 1997
Abstract

Cited by 50 (8 self)
The term Soft Computing (SC) represents the combination of emerging problem-solving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods for solving complex, real-world problems. After a brief description of each of these technologies, we analyze some of their most useful combinations, such as the use of FL to control GA and NN parameters; the application of GAs to evolve NNs (topologies or weights) or to tune FL controllers; and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms.
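One of the combinations the abstract lists, a GA evolving NN weights, can be sketched in a few lines. Everything here is illustrative rather than taken from the paper: a fixed 2-2-1 sigmoid network on the XOR task, truncation selection, one-point crossover, and Gaussian point mutation.

```python
import math
import random

random.seed(0)

# Truth table for XOR; task, topology, and GA settings are illustrative.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_WEIGHTS = 9  # 2-2-1 network: 6 hidden-layer parameters + 3 output parameters

def forward(w, x):
    """Evaluate a 2-2-1 sigmoid network with flat weight vector w."""
    s = lambda z: 1.0 / (1.0 + math.exp(-z))
    h1 = s(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = s(w[3] * x[0] + w[4] * x[1] + w[5])
    return s(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):
    """Negative squared error over the XOR table (0 is perfect)."""
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=60, generations=300, sigma=0.5):
    """Evolve weights: truncation selection, crossover, point mutation."""
    pop = [[random.uniform(-2, 2) for _ in range(N_WEIGHTS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:pop_size // 4]                  # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_WEIGHTS)     # one-point crossover
            child = a[:cut] + b[cut:]
            i = random.randrange(N_WEIGHTS)          # Gaussian point mutation
            child[i] += random.gauss(0, sigma)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
preds = [round(forward(best, x)) for x, _ in XOR]
print("predictions:", preds, "fitness:", round(fitness(best), 3))
```

The complementary direction mentioned in the abstract, backpropagation tuning a fuzzy controller implemented as an NN, would replace the GA loop above with gradient steps on the same network.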
Prediction via Orthogonalized Model Mixing
JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 1994
Abstract

Cited by 50 (9 self)
In this paper we introduce an approach and algorithms for model mixing in large prediction problems with correlated predictors. We focus on the choice of predictors in linear models, and mix over possible subsets of candidate predictors. Our approach is based on expressing the space of models in terms of an orthogonalization of the design matrix. Advantages are both statistical and computational. Statistically, orthogonalization often leads to a reduction in the number of competing models by eliminating correlations. Computationally, large model spaces cannot be enumerated; recent approaches are based on sampling models with high posterior probability via Markov chains. Based on orthogonalization of the space of candidate predictors, we can approximate the posterior probabilities of models by products of predictor-specific terms. This leads to an importance sampling function for sampling directly from the joint distribution over the model space, without resorting to Markov chains. Comp...
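The mechanism the abstract describes can be sketched under stated assumptions: a QR factorization stands in for the paper's orthogonalization, and the per-predictor inclusion weights below are an illustrative stand-in for its posterior approximation, not the paper's formula. What the sketch does show faithfully is why the approach avoids Markov chains: once model probabilities factorize over predictors, whole subsets can be drawn i.i.d.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 5 predictors, two of which drive the response, with one
# predictor strongly correlated with another.
n, p = 200, 5
X = rng.normal(size=(n, p))
X[:, 2] = 0.9 * X[:, 0] + 0.1 * rng.normal(size=n)   # induced correlation
y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + rng.normal(size=n)

# Orthogonalize the candidate predictors: QR gives orthonormal columns.
Q, R = np.linalg.qr(X)

# On orthonormal columns, the least-squares coefficients decouple into
# q_j' y; squash their magnitudes into per-predictor inclusion weights
# (an illustrative choice, not the paper's posterior term).
b = np.abs(Q.T @ y)
incl = b / (b + np.median(b))

# Direct importance sampling from the product-of-Bernoullis
# approximation: each row is a sampled subset of orthogonalized
# predictors, and its sampling probability factorizes, so no Markov
# chain is needed.
models = rng.random((1000, p)) < incl
sample_prob = np.prod(np.where(models, incl, 1 - incl), axis=1)

print("inclusion weights:", np.round(incl, 2))
print("empirical inclusion frequencies:", models.mean(axis=0).round(2))
```

In a full implementation the factorized probabilities would serve as the importance-sampling proposal, with weights correcting toward the exact model posterior.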
Choice Under Uncertainty with the Best and Worst in Mind: Neoadditive Capacities
Journal of Economic Theory, 2007
Abstract

Cited by 35 (9 self)
The concept of a non-extreme-outcome-additive capacity (neo-additive capacity) is introduced. Neo-additive capacities model optimistic and pessimistic attitudes towards uncertainty as observed in many experimental studies. Moreover, neo-additive capacities can be applied easily in economic problems, as we demonstrate by examples. This paper provides an axiomatisation of Choquet expected utility with neo-additive capacities in a framework of purely subjective uncertainty. JEL Classification:
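The "best and worst in mind" of the title has a compact numerical form: under a neo-additive capacity, the Choquet expected utility reduces to a convex combination of ordinary expected utility with the best and worst outcomes. The sketch below uses one common parameterization (ambiguity weight delta, optimism weight alpha); the paper's own convention may differ.

```python
def neo_additive_ceu(utilities, probs, delta, alpha):
    """Choquet EU under a neo-additive capacity (one common convention):
    (1 - delta) * E_pi[u] + delta * (alpha * max(u) + (1 - alpha) * min(u)).
    delta in [0, 1] is the weight on ambiguity; alpha in [0, 1] is the
    degree of optimism within that ambiguous part.
    """
    eu = sum(u * p for u, p in zip(utilities, probs))
    return ((1 - delta) * eu
            + delta * (alpha * max(utilities) + (1 - alpha) * min(utilities)))

# A bet paying 0, 50, or 100 with subjective probabilities (1/4, 1/2, 1/4):
u, p = [0, 50, 100], [0.25, 0.5, 0.25]
print(neo_additive_ceu(u, p, delta=0.0, alpha=0.5))  # pure expected utility, 50.0
print(neo_additive_ceu(u, p, delta=0.4, alpha=1.0))  # optimistic tilt toward the max, about 70
print(neo_additive_ceu(u, p, delta=0.4, alpha=0.0))  # pessimistic tilt toward the min, about 30
```

Setting delta = 0 recovers subjective expected utility, while delta = 1 ignores the additive part entirely, which is the sense in which the capacity deviates from additivity only at the extremes.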
Updating Beliefs with Incomplete Observations
Abstract

Cited by 32 (10 self)
Currently, there is renewed interest in the problem, raised by Shafer in 1985, of updating probabilities when observations are incomplete (or set-valued). This is a fundamental problem in general, and of particular interest for Bayesian networks. Recently, Grünwald and Halpern have shown that commonly used updating strategies fail in this case, except under very special assumptions. In this paper we propose a new method for updating probabilities with incomplete observations. Our approach is deliberately conservative: we make no assumptions about the so-called incompleteness mechanism that associates complete with incomplete observations. We model our ignorance about this mechanism by a vacuous lower prevision, a tool from the theory of imprecise probabilities, and we use only coherence arguments to turn prior into posterior (updated) probabilities. In general, this new approach to updating produces lower and upper posterior probabilities and previsions (expectations), as well as partially determinate decisions. This is a logical consequence of the existing ignorance about the incompleteness mechanism. As an example, we use the new updating method to properly address the apparent paradox in the 'Monty Hall' puzzle. More importantly, we apply it to the problem of classification of new evidence in probabilistic expert systems, where it leads to a new, so-called conservative updating rule.
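The Monty Hall remark can be made concrete with a small sketch. In the standard reading, the unknown incompleteness mechanism is the host's tie-breaking rule: if the car is behind the contestant's door 1, the host opens door 3 with some probability q in [0, 1]. Being vacuous about the mechanism means computing posterior bounds over every q rather than silently assuming q = 1/2; the details of the paper's own treatment may differ.

```python
def p_switch_wins_given_host_opens_3(q):
    """Posterior that switching to door 2 wins, given the contestant
    picked door 1 and the host opened door 3, for tie-break parameter q.
    Car behind door 2 (prior 1/3): host must open door 3.
    Car behind door 1 (prior 1/3): host opens door 3 with probability q.
    Car behind door 3: the host would never open it.
    """
    return (1 / 3) / (1 / 3 + q * (1 / 3))

# Vacuous beliefs about the mechanism: sweep q over [0, 1] and report
# the lower and upper posterior probability that switching wins.
posteriors = [p_switch_wins_given_host_opens_3(q / 100) for q in range(101)]
lower, upper = min(posteriors), max(posteriors)
print(lower, upper)  # 0.5 and 1.0
```

The resulting interval [1/2, 1] is exactly the "partially determinate" flavor the abstract describes: the posterior is not a single number, yet switching is never worse than staying, so the decision to switch survives the ignorance about the mechanism.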
The Adaptive Markets Hypothesis: Market Efficiency from an Evolutionary Perspective
THE JOURNAL OF PORTFOLIO MANAGEMENT, 2004
Abstract

Cited by 29 (9 self)
The 30th anniversary of The Journal of Portfolio Management is a milestone in the rich intellectual history of modern finance, firmly establishing the relevance of quantitative models and scientific inquiry in the practice of financial management. One of the most enduring ideas from this intellectual history is the Efficient Markets Hypothesis (EMH), a deceptively simple notion that has become a lightning rod for its disciples and the proponents of behavioral economics and finance. In its purest form, the EMH obviates active portfolio management, calling into question the very motivation for portfolio research. It is only fitting that we revisit this groundbreaking idea after three very successful decades of this Journal. In this article, I review the current state of the controversy surrounding the EMH and propose a new perspective that reconciles the two opposing schools of thought. The proposed reconciliation, which I call the Adaptive Markets Hypothesis (AMH), is based on an evolutionary approach to economic interactions, as well as some recent research in the cognitive neurosciences that has been transforming and revitalizing the intersection of psychology and economics. Although some of these ideas have not yet been fully articulated within a rigorous quantitative framework, long-time students of the EMH and seasoned practitioners will no doubt immediately recognize the possibilities generated by this new perspective. Only time will tell whether its potential will be fulfilled. I begin with a brief review of the classic version of the EMH, and then summarize the most significant criticisms leveled against it by psychologists and behavioral economists. I argue that the sources of this controversy can
On exchangeable random variables and the statistics of large graphs and hypergraphs, 2008
Inductive Inference: An Axiomatic Approach
ECONOMETRICA, 1999
Abstract

Cited by 20 (10 self)
A predictor is asked to rank eventualities according to their plausibility, based on past cases. We assume that she can form a ranking given any memory that consists of finitely many past cases. Mild consistency requirements on these rankings imply that they have a numerical representation via a matrix assigning numbers to eventuality-case pairs, as follows. Given a memory, each eventuality is ranked according to the sum of the numbers in its row, over cases in memory. The number attached to an eventuality-case pair can be interpreted as the degree of support that the past case lends to the plausibility of the eventuality. Special instances of this result may be viewed as axiomatizing kernel methods for estimation of densities and for classification problems. Interpreting the same result for rankings of theories or hypotheses, rather than of specific eventualities, it is shown that one may ascribe to the predictor subjective conditional probabilities of cases given theories, such that her rankings of theories agree with rankings by the likelihood functions.
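The matrix representation the abstract describes is simple enough to sketch directly: a number v[e][c] records the support case c lends to eventuality e, and any memory of past cases ranks eventualities by row sums over those cases. The eventualities, cases, and numbers below are purely illustrative.

```python
# Illustrative eventualities, past cases, and support numbers.
eventualities = ["rain", "sun"]
cases = ["c1", "c2", "c3", "c4"]

# v[eventuality][case]: support that the past case lends to the
# plausibility of the eventuality (the matrix of the representation).
v = {
    "rain": {"c1": 2.0, "c2": 0.5, "c3": 1.0, "c4": 0.0},
    "sun":  {"c1": 0.0, "c2": 1.5, "c3": 0.5, "c4": 2.0},
}

def rank(memory):
    """Rank eventualities by total support summed over cases in memory."""
    scores = {e: sum(v[e][c] for c in memory) for e in eventualities}
    return sorted(eventualities, key=scores.get, reverse=True), scores

order, scores = rank(["c1", "c3"])
print(order, scores)  # ['rain', 'sun'] with scores 3.0 and 0.5
```

The kernel-method instance the abstract mentions corresponds to choosing v[e][c] as a kernel similarity between the eventuality and the past case, so that the row sum becomes a kernel density or nearest-neighbour style score.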
Interpretations of probability
In The Stanford Encyclopedia of Philosophy (Zalta), 2003
Abstract

Cited by 18 (2 self)
“Probability is the very guide of life”, as Bishop Butler wrote in 1736. Probability judgments of the efficacy and side-effects of a pharmaceutical drug determine whether or not it is approved for release to the public. The fate of a defendant on trial for murder hinges on the jurors' opinions about the probabilistic weight of evidence. Geologists calculate the probability that an earthquake of a certain intensity will hit a given city, and engineers accordingly build skyscrapers with specified probabilities of withstanding such earthquakes. Probability undergirds even measurement itself, since the error bounds that accompany measurements are essentially probabilistic confidence intervals. We find probability wherever we find uncertainty, which is to say almost everywhere in our lives.

It is surprising, then, that probability arrived comparatively late on the intellectual scene. To be sure, a notion of randomness was known to the ancients: Epicurus, and later Lucretius, believed that atoms occasionally underwent indeterministic swerves. Averroes had a notion of 'equipotency' that might be regarded as a precursor to probabilistic notions. But probability theory was not conceived until the 17th century, with the Fermat-Pascal correspondence and the Port-Royal Logic. Over the next three centuries, the theory was developed by such authors as Huygens, Bernoulli, Bayes, Laplace, Condorcet, de Moivre, Venn, Johnson, and Keynes. Arguably, the crowning achievement was Kolmogorov's axiomatization in 1933, which put probability on a rigorous mathematical footing.

The formal theory of probability. In Kolmogorov's theory, probabilities are numerical values that are assigned to 'events'. The numbers are non-negative; they have a maximum value of 1; and the probability that one of two mutually exclusive events occurs is the sum of their individual probabilities.
Said more formally: given a set Ω and a privileged set F of subsets of Ω, a probability is a function P from F to the real numbers that obeys, for all X and Y in F:
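The three conditions just stated in prose, written out symbolically in the standard finite-additivity form the sentence introduces:

```latex
\textbf{Kolmogorov's probability axioms.} For all $X, Y \in F$:
\begin{align*}
  &\text{(Non-negativity)}\quad    && P(X) \ge 0, \\
  &\text{(Normalization)}\quad     && P(\Omega) = 1, \\
  &\text{(Finite additivity)}\quad && P(X \cup Y) = P(X) + P(Y)
      \quad \text{whenever } X \cap Y = \emptyset.
\end{align*}
```

Kolmogorov's full treatment strengthens the last axiom to countable additivity over sequences of pairwise disjoint events; the finite form above is what the prose summary states.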