Numerical Uncertainty Management in User and Student Modeling: An Overview of Systems and Issues
, 1996
"... . A rapidly growing number of user and student modeling systems have employed numerical techniques for uncertainty management. The three major paradigms are those of Bayesian networks, the DempsterShafer theory of evidence, and fuzzy logic. In this overview, each of the first three main sections fo ..."
Abstract

Cited by 105 (10 self)
A rapidly growing number of user and student modeling systems have employed numerical techniques for uncertainty management. The three major paradigms are those of Bayesian networks, the Dempster-Shafer theory of evidence, and fuzzy logic. In this overview, each of the first three main sections focuses on one of these paradigms. It first introduces the basic concepts by showing how they can be applied to a relatively simple user modeling problem. It then surveys systems that have applied techniques from the paradigm to user or student modeling, characterizing each system within a common framework. The final main section discusses several aspects of the usability of these techniques for user and student modeling, such as their knowledge engineering requirements, their need for computational resources, and the communicability of their results. Key words: numerical uncertainty management, Bayesian networks, Dempster-Shafer theory, fuzzy logic, user modeling, student modeling 1. Introdu...
Perspectives on the Theory and Practice of Belief Functions
 International Journal of Approximate Reasoning
, 1990
"... The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answer ..."
Abstract

Cited by 85 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors
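The abstract's central idea, deriving degrees of belief for one question from probabilities for a related question, can be sketched via Dempster's multivalued-mapping construction. The witness scenario and its numbers below are hypothetical, chosen only to illustrate:

```python
# Minimal sketch: degrees of belief on one question derived from
# probabilities on a related question. All names and numbers are
# hypothetical, for illustration only.

def belief(mass, hypothesis):
    """Bel(A): total mass on focal elements contained in A."""
    return sum(v for s, v in mass.items() if s <= hypothesis)

def plausibility(mass, hypothesis):
    """Pl(A): total mass on focal elements intersecting A."""
    return sum(v for s, v in mass.items() if s & hypothesis)

# Related question: "is the witness reliable?", with known probabilities.
# If reliable, the answer to our question is exactly {guilty}; if not,
# the testimony tells us nothing, so any answer in {guilty, innocent}.
compatible = {
    "reliable":   (0.8, frozenset({"guilty"})),
    "unreliable": (0.2, frozenset({"guilty", "innocent"})),
}
mass = {}
for p, focal in compatible.values():
    mass[focal] = mass.get(focal, 0.0) + p

bel_guilty = belief(mass, frozenset({"guilty"}))          # 0.8
pl_guilty = plausibility(mass, frozenset({"guilty"}))     # 1.0
bel_innocent = belief(mass, frozenset({"innocent"}))      # 0
```

Note that Bel({guilty}) + Bel({innocent}) = 0.8 < 1, illustrating the abstract's point that these degrees of belief need not have the mathematical properties of probabilities.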
Two views of belief: Belief as generalized probability and belief as evidence
, 1992
"... : Belief functions are mathematical objects defined to satisfy three axioms that look somewhat similar to the Kolmogorov axioms defining probability functions. We argue that there are (at least) two useful and quite different ways of understanding belief functions. The first is as a generalized prob ..."
Abstract

Cited by 72 (12 self)
Belief functions are mathematical objects defined to satisfy three axioms that look somewhat similar to the Kolmogorov axioms defining probability functions. We argue that there are (at least) two useful and quite different ways of understanding belief functions. The first is as a generalized probability function (which technically corresponds to the inner measure induced by a probability function). The second is as a way of representing evidence. Evidence, in turn, can be understood as a mapping from probability functions to probability functions. It makes sense to think of updating a belief if we think of it as a generalized probability. On the other hand, it makes sense to combine two beliefs (using, say, Dempster's rule of combination) only if we think of the belief functions as representing evidence. Many previous papers have pointed out problems with the belief function approach; the claim of this paper is that these problems can be explained as a consequence of confounding the...
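Dempster's rule of combination, which the abstract argues applies only under the evidence view, can be sketched as follows. The frame and the mass assignments are hypothetical:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions over the same frame.

    Each mass function maps frozenset focal elements to masses summing
    to 1. Product mass falling on the empty intersection (the conflict)
    is discarded and the remainder renormalized.
    """
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence: rule undefined")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Hypothetical frame {a, b}; two partially conflicting pieces of evidence.
theta = frozenset({"a", "b"})
m1 = {frozenset({"a"}): 0.6, theta: 0.4}   # evidence favoring a
m2 = {frozenset({"b"}): 0.7, theta: 0.3}   # evidence favoring b
m = dempster_combine(m1, m2)
# Conflict of 0.6 * 0.7 = 0.42 is renormalized away; m sums to 1 again.
```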
Children and adults as intuitive scientists
 Psychological Review
, 1989
"... The metaphor of children and lay adults as intuitive scientists has gained wide acceptance. Although useful in one sense, pertaining to scientific understanding, in another, pertaining to the process of scientific thinking, the metaphor may be fundamentally misleading. Research is reviewed indicatin ..."
Abstract

Cited by 70 (0 self)
The metaphor of children and lay adults as intuitive scientists has gained wide acceptance. Although useful in one sense, pertaining to scientific understanding, in another, pertaining to the process of scientific thinking, the metaphor may be fundamentally misleading. Research is reviewed indicating that processes of scientific thinking differ significantly in children, lay adults, and scientists. Hence, it is the instruments of scientific thinking, not just the products, that undergo "strong restructuring" (Carey, 1986). A framework for conceptualizing development of scientific thinking processes is proposed, centering on progressive differentiation and coordination of theory and evidence. This development is metacognitive, as well as strategic. It requires thinking about theories, rather than merely with them, and thinking about evidence, rather than merely being influenced by it, and, hence, reflects the attainment of control over the interaction of theories and evidence in one's own thinking. The metaphor of the lay adult—or the child—as an intuitive scientist has gained wide acceptance in the last decade. As the scientist explores the environment, constructs models as a basis for understanding it, and revises those models as new evidence is generated, so do lay people endeavor to make sense of their
Decision Making with Belief Functions: Compatibility and Incompatibility with the Sure-Thing Principle
 Journal of Risk and Uncertainty, 8:255-271 (1994)
, 1994
"... This article studies situations in which information is ambiguous and only part of it can be probabilized. It is shown that the information can be modeled through belief functions if and only if the nonprobabilizable information is subject to the principles of complete ignorance. Next the representa ..."
Abstract

Cited by 21 (1 self)
This article studies situations in which information is ambiguous and only part of it can be probabilized. It is shown that the information can be modeled through belief functions if and only if the non-probabilizable information is subject to the principles of complete ignorance. Next the representability of decisions by belief functions on outcomes is justified by means of a neutrality axiom. The natural weakening of Savage's sure-thing principle to unambiguous events is examined and its implications for decision making are identified.
A defect in Dempster-Shafer theory
 In Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence
, 1994
"... By analyzing the relationships among chance, weight of evidence and degree ofbelief, it is shown that the assertion \chances are special cases of belief functions " and the assertion \Dempster's rule can be used to combine belief functions based on distinct bodies of evidence " together lead to an i ..."
Abstract

Cited by 15 (12 self)
By analyzing the relationships among chance, weight of evidence and degree of belief, it is shown that the assertion "chances are special cases of belief functions" and the assertion "Dempster's rule can be used to combine belief functions based on distinct bodies of evidence" together lead to an inconsistency in Dempster-Shafer theory. To solve this problem, some fundamental postulates of the theory must be rejected. A new approach for uncertainty management is introduced, which shares many intuitive ideas with DS theory, while avoiding this problem.
A Logic for Default Reasoning About Probabilities
, 1998
"... A logic is defined that allows to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the in uence of statistical informa ..."
Abstract

Cited by 12 (4 self)
A logic is defined that allows one to express information about statistical probabilities and about degrees of belief in specific propositions. By interpreting the two types of probabilities in one common probability space, the semantics given are well suited to model the influence of statistical information on the formation of subjective beliefs. Cross entropy minimization is a key element in these semantics, the use of which is justified by showing that the resulting logic exhibits some very reasonable properties.
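The role of cross entropy minimization in forming subjective beliefs from statistical information can be illustrated with a minimal sketch. With a single expectation constraint, the minimizer is an exponential tilting of the prior, and its multiplier can be found by bisection; the fair-die example below is invented for illustration:

```python
import math

def min_cross_entropy(prior, f, target, lo=-50.0, hi=50.0, tol=1e-12):
    """Find Q minimizing cross entropy relative to `prior` (i.e. the
    KL divergence KL(Q || prior)) subject to E_Q[f] = target.

    The minimizer has the form Q_i proportional to prior_i * exp(lam * f_i);
    E_Q[f] is monotone increasing in lam, so bisection finds lam.
    """
    def tilted(lam):
        w = [p * math.exp(lam * fi) for p, fi in zip(prior, f)]
        z = sum(w)
        return [wi / z for wi in w]

    while hi - lo > tol:
        mid = (lo + hi) / 2
        q = tilted(mid)
        if sum(qi * fi for qi, fi in zip(q, f)) < target:
            lo = mid
        else:
            hi = mid
    return tilted((lo + hi) / 2)

# Statistical prior: a fair die. Subjective constraint: mean face value 4.5.
prior = [1 / 6] * 6
faces = [1, 2, 3, 4, 5, 6]
q = min_cross_entropy(prior, faces, 4.5)
# q is the belief distribution closest to the fair-die prior (in KL)
# among all distributions whose expected face value is 4.5.
```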
Critical Decisions under Uncertainty: Representation and Structure
, 1988
"... How do people make difficult decisions in situations involving substantial risk and uncertainty? In this study, we presented a difficult medical decision to three expert physicians in a combined "thinking aloud" and "cross examination" experiment. Verbatim transcripts were analyzed using script anal ..."
Abstract

Cited by 6 (0 self)
How do people make difficult decisions in situations involving substantial risk and uncertainty? In this study, we presented a difficult medical decision to three expert physicians in a combined "thinking aloud" and "cross examination" experiment. Verbatim transcripts were analyzed using script analysis to observe the process of constructing and making the decision, and using referring phrase analysis to determine the representation of knowledge of likelihoods. These analyses are compared with a formal decision analysis of the same problem to highlight similarities and differences. The process of making the decision resembles an incremental, sequential-refinement planning algorithm, where a complex decision is broken into a sequence of choices to be made with a simplified description of the alternatives. This strategy results in certain kinds of relevant information being underweighted in the final decision. Knowledge of likelihood appears to be represented as symbolic descriptions c...