Results 1–10 of 508
Prospect theory: An analysis of decisions under risk
 Econometrica
, 1979
Cited by 5036 (24 self)
Learning Bayesian networks: The combination of knowledge and statistical data
 Machine Learning
, 1995
Cited by 1090 (36 self)
We describe scoring metrics for learning Bayesian networks from a combination of user knowledge and statistical data. We identify two important properties of metrics, which we call event equivalence and parameter modularity. These properties have been mostly ignored, but when combined, greatly simplify the encoding of a user’s prior knowledge. In particular, a user can express his knowledge—for the most part—as a single prior Bayesian network for the domain.
The Transferable Belief Model
 Artificial Intelligence
, 1994
Cited by 461 (14 self)
We describe the transferable belief model, a model for representing quantified beliefs based on belief functions. Beliefs can be held at two levels: (1) a credal level where beliefs are entertained and quantified by belief functions, (2) a pignistic level where beliefs can be used to make decisions and are quantified by probability functions. The relation between the belief function and the probability function when decisions must be made is derived and justified. Four paradigms are analyzed in order to compare Bayesian, upper and lower probability, and the transferable belief approaches.
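The credal-to-pignistic step the abstract describes has a standard closed form, the pignistic transformation: the mass of each focal set is spread evenly over its elements and renormalised by 1 − m(∅). A minimal sketch (the example mass assignment is made up for illustration):

```python
def pignistic(masses, frame):
    """Pignistic transformation of the Transferable Belief Model:
    spread the mass of each focal set A evenly over its elements,
    renormalising by 1 - m(emptyset)."""
    m_empty = masses.get(frozenset(), 0.0)
    betp = {x: 0.0 for x in frame}
    for focal, m in masses.items():
        if not focal:
            continue
        share = m / (len(focal) * (1.0 - m_empty))
        for x in focal:
            betp[x] += share
    return betp

# Example: frame {a, b, c}; mass 0.6 on {a}, 0.4 on {a, b}
m = {frozenset({'a'}): 0.6, frozenset({'a', 'b'}): 0.4}
print(pignistic(m, {'a', 'b', 'c'}))
# 'a' gets 0.6 + 0.2 = 0.8, 'b' gets 0.2, 'c' gets 0.0
```

The transformation is only applied at the pignistic level, when a decision must actually be made; beliefs at the credal level stay as belief functions.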
A Tutorial on Learning Bayesian Networks
 Communications of the ACM
, 1995
Cited by 351 (13 self)
We examine a graphical representation of uncertain knowledge called a Bayesian network. The representation is easy to construct and interpret, yet has formal probabilistic semantics making it suitable for statistical manipulation. We show how we can use the representation to learn new knowledge by combining domain knowledge with statistical data.

1 Introduction

Many techniques for learning rely heavily on data. In contrast, the knowledge encoded in expert systems usually comes solely from an expert. In this paper, we examine a knowledge representation, called a Bayesian network, that lets us have the best of both worlds. Namely, the representation allows us to learn new knowledge by combining expert domain knowledge and statistical data. A Bayesian network is a graphical representation of uncertain knowledge that most people find easy to construct and interpret. In addition, the representation has formal probabilistic semantics, making it suitable for statistical manipulation (Howard,...
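The "best of both worlds" representation rests on a simple factorisation: a Bayesian network encodes the joint distribution as the product of each node's probability given its parents. A toy sketch with a hypothetical rain/sprinkler network (the CPT numbers are illustrative, not from the paper):

```python
# A Bayesian network factorises a joint distribution as the product of
# each node's probability given its parents. Hypothetical network:
# Rain -> Sprinkler, Rain -> WetGrass, Sprinkler -> WetGrass.

p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: {True: 0.01, False: 0.99},   # P(S | R=True)
               False: {True: 0.40, False: 0.60}}  # P(S | R=False)
p_wet = {(True, True): 0.99, (True, False): 0.80,
         (False, True): 0.90, (False, False): 0.01}  # P(W=True | R, S)

def joint(r, s, w):
    """P(R=r, S=s, W=w) = P(r) * P(s|r) * P(w|r,s)."""
    pw = p_wet[(r, s)] if w else 1.0 - p_wet[(r, s)]
    return p_rain[r] * p_sprinkler[r][s] * pw

# A valid factorisation must sum to 1 over all eight assignments:
total = sum(joint(r, s, w) for r in (True, False)
            for s in (True, False) for w in (True, False))
print(round(total, 10))  # 1.0
```

Learning then amounts to filling in these conditional tables from expert priors plus observed data, which is what the tutorial goes on to develop.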
Fourier descriptors for plane closed curves
 IEEE Transactions on Computers, C-21
, 1972
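No abstract is available for this entry, but the core idea can be sketched: treat the boundary points of a plane closed curve as complex numbers and take their discrete Fourier transform; the coefficients serve as shape descriptors. A hedged sketch (the normalisation choices are assumptions, not necessarily the 1972 paper's exact formulation):

```python
import numpy as np

def fourier_descriptors(x, y, n_keep=8):
    """Fourier descriptors of a closed curve: boundary points (x_k, y_k)
    become complex samples z_k = x_k + i*y_k, then a DFT is taken."""
    z = np.asarray(x) + 1j * np.asarray(y)
    coeffs = np.fft.fft(z) / len(z)
    # Dropping the DC term makes the descriptor translation-invariant;
    # dividing by |c_1| would additionally normalise scale.
    return coeffs[1:n_keep + 1]

# Unit circle sampled at 64 points: all energy sits in the first harmonic.
t = np.linspace(0, 2 * np.pi, 64, endpoint=False)
d = fourier_descriptors(np.cos(t), np.sin(t))
print(np.round(np.abs(d), 6))  # first entry 1, the rest 0
```

Truncating to a few low-order coefficients gives a compact, smoothed description of the boundary shape.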
Inequality decomposition by factor components
 Econometrica
, 1982
Cited by 90 (1 self)
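The abstract for this entry is unavailable, but Shorrocks' well-known "natural" decomposition rule assigns income source k the inequality share cov(Y_k, Y)/var(Y), where Y is total income summed over sources. A sketch with made-up income components (the distributions and sample size are illustrative assumptions):

```python
import numpy as np

# Shorrocks' natural decomposition rule: factor k contributes the share
# cov(Y_k, Y) / var(Y) of total inequality, where Y = sum_k Y_k.

rng = np.random.default_rng(0)
labour = rng.normal(30_000, 8_000, size=1_000)   # made-up labour income
capital = rng.normal(5_000, 4_000, size=1_000)   # made-up capital income
total = labour + capital

def factor_share(component, total):
    # np.cov uses ddof=1 by default; match it in the variance.
    return np.cov(component, total)[0, 1] / np.var(total, ddof=1)

shares = [factor_share(c, total) for c in (labour, capital)]
print(round(sum(shares), 10))  # 1.0 -- the shares sum to one by construction
```

The shares sum to one identically because the covariances with the total add up to the total's variance, which is what makes the rule a decomposition.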
Comparametric Equations with Practical Applications in Quantigraphic Image Processing
, 2000
Cited by 75 (10 self)
It is argued that, hidden within the flow of signals from typical cameras, through image processing, to display media, is a homomorphic filter. While homomorphic filtering is often desirable, there are some occasions where it is not. Thus, cancellation of this implicit homomorphic filter is proposed, through the introduction of an antihomomorphic filter. This concept gives rise to the principle of quantigraphic image processing, wherein it is argued that most cameras can be modeled as an array of idealized light meters each linearly responsive to a semimonotonic function of the quantity of light received, integrated over a fixed spectral response profile. This quantity is neither radiometric nor photometric, but, rather, depends only on the spectral response of the sensor elements in the camera. A particular class of functional equations, called comparametric equations, is introduced as a basis for quantigraphic image processing. Comparametric equations are fundamental to the analysis and processing of multiple images differing only in exposure. The well-known "gamma correction" of an image is presented as a simple example of a comparametric equation, for which it is shown that the underlying quantigraphic function does not pass through the origin. For this reason it is argued that exposure adjustment by gamma correction is inherently flawed, and alternatives are provided. These alternatives, when applied to a plurality of images that differ only in exposure, give rise to a new kind of processing in the "amplitude domain" (as opposed to the time domain or the frequency domain). While the theoretical framework presented in this paper originated within the field of wearable cybernetics (wearable photographic apparatus) in the 1970s and early 1980s, it is applicable to th...
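A comparametric equation relates two images of the same scene that differ only in exposure: g = f(kq) expressed as a function of f = f(q), with the photoquantity q eliminated. A toy sketch with an assumed power-law camera response (the exponent and exposure ratio are made-up values, not the paper's):

```python
import numpy as np

# Comparametric sketch: with an assumed power-law response f(q) = q**a,
# two exposures of the same scene related by q -> k*q satisfy the
# comparametric equation g = k**a * f, with q eliminated entirely.

a, k = 0.45, 2.0                      # assumed response exponent, exposure ratio
q = np.linspace(0.1, 1.0, 5)          # photoquantities of light at 5 pixels
f = q ** a                            # first image
g = (k * q) ** a                      # second image, twice the exposure
print(np.allclose(g, (k ** a) * f))   # True: the comparametric relation holds
```

Plotting g against f (a "comparagram") reveals the response function's shape without ever knowing q, which is the basis of the exposure-joining methods the abstract describes.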
Some issues on consistency of fuzzy preference relations
 European Journal of Operational Research
, 2004
Cited by 66 (29 self)
In decision making, the study of consistency when the decision makers express their opinions by means of preference relations is a very important aspect for avoiding misleading solutions. In decision-making problems based on fuzzy preference relations, the study of consistency is associated with the study of the transitivity property. In this paper, a new characterization of the consistency property, defined by the additive transitivity of fuzzy preference relations, is presented. Using this new characterization, a method for constructing consistent fuzzy preference relations from a set of n − 1 preference values is proposed. Applying this method it is possible to assure better consistency of the fuzzy preference relations provided by the decision makers and, in such a way, to avoid inconsistent solutions in the decision-making processes. Additionally, a similar study of consistency is developed for the case of multiplicative preference relations.
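The construction from n − 1 preference values can be sketched directly from the additive-transitivity condition p_ik = p_ij + p_jk − 0.5: chain it along consecutive alternatives and fill the lower triangle by additive reciprocity. A minimal sketch (handling of values falling outside [0, 1] is omitted, and the example preferences are made up):

```python
import numpy as np

# Build a fully consistent fuzzy preference relation from the n-1 values
# p_{i,i+1}, using additive transitivity p_ik = p_ij + p_jk - 0.5.

def consistent_relation(adjacent):
    """adjacent: list of the n-1 preferences p_{i,i+1}."""
    n = len(adjacent) + 1
    p = np.full((n, n), 0.5)          # indifference on the diagonal
    for i in range(n):
        for k in range(i + 1, n):
            # chain additive transitivity along i, i+1, ..., k
            p[i, k] = sum(adjacent[i:k]) - (k - i - 1) * 0.5
            p[k, i] = 1.0 - p[i, k]   # additive reciprocity
    return p

p = consistent_relation([0.6, 0.7, 0.55])
# additive transitivity now holds for every triple (i, j, k):
ok = all(abs(p[i, k] - (p[i, j] + p[j, k] - 0.5)) < 1e-12
         for i in range(4) for j in range(4) for k in range(4))
print(ok)  # True
```

Chaining consecutive values this way is exactly why only n − 1 judgments are needed: every other entry is determined once additive transitivity is imposed.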
Multidimensional poverty indices
 Social Choice and Welfare
, 2002
Cited by 63 (0 self)
This paper explores the axiomatic foundation of multidimensional poverty indices. Departing from the income approach, which measures poverty by aggregating shortfalls of incomes from a predetermined poverty-line income, a multidimensional index is a numerical representation of shortfalls of basic needs from some prespecified minimum levels. The class of subgroup consistent poverty indices introduced by Foster and Shorrocks (1991) is generalized to the multidimensional context. New concepts necessary for the design of distribution-sensitive multidimensional poverty measures are introduced. Specific classes of subgroup consistent multidimensional poverty measures are derived based on sets of reasonable axioms. This paper also highlights the fact that domain restrictions may have a critical role in the design of multidimensional indices.
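An illustrative measure in the spirit the abstract describes aggregates normalised shortfalls of each basic need from its minimum level. The functional form, weights, and exponent below are assumptions for the sketch, not the paper's axiomatically derived classes:

```python
import numpy as np

# Illustrative multidimensional poverty measure: average over people of
# the mean power-alpha normalised shortfall across dimensions. The data
# and alpha are made up; the real paper derives its classes from axioms.

def poverty_index(x, z, alpha=2.0):
    """x: (people, dimensions) attainments; z: minimum levels per dimension."""
    shortfall = np.clip((z - x) / z, 0.0, 1.0)   # normalised shortfalls in [0, 1]
    individual = (shortfall ** alpha).mean(axis=1)
    return individual.mean()                      # average over the population

x = np.array([[100.0, 8.0],    # income, years of schooling
              [40.0, 2.0],
              [70.0, 6.0]])
z = np.array([60.0, 5.0])      # assumed minimum levels per dimension
print(poverty_index(x, z) > 0)  # True: person 2 falls short in both dimensions
```

Averaging over individuals first is what makes such a measure subgroup consistent: the index for a population is a population-weighted mean of the subgroup indices.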