Results 1–10 of 331
Partial Constraint Satisfaction
, 1992
"... . A constraint satisfaction problem involves finding values for variables subject to constraints on which combinations of values are allowed. In some cases it may be impossible or impractical to solve these problems completely. We may seek to partially solve the problem, in particular by satisfying ..."
Abstract

Cited by 443 (23 self)
A constraint satisfaction problem involves finding values for variables subject to constraints on which combinations of values are allowed. In some cases it may be impossible or impractical to solve these problems completely. We may seek to partially solve the problem, in particular by satisfying a maximal number of constraints. Standard backtracking and local consistency techniques for solving constraint satisfaction problems can be adapted to cope with, and take advantage of, the differences between partial and complete constraint satisfaction. Extensive experimentation on maximal satisfaction problems illuminates the relative and absolute effectiveness of these methods. A general model of partial constraint satisfaction is proposed.
1 Introduction
Constraint satisfaction involves finding values for problem variables subject to constraints on acceptable combinations of values. Constraint satisfaction has wide application in artificial intelligence, in areas ranging from temporal r...
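As a concrete illustration of adapting backtracking to maximal satisfaction, here is a minimal branch-and-bound sketch for MAX-CSP. The problem encoding (scope/predicate pairs) and all names are assumptions for illustration, not the paper's own formulation.

```python
# Hypothetical sketch: branch-and-bound search for maximal constraint
# satisfaction (MAX-CSP). The encoding is illustrative, not the paper's.
def max_csp(variables, domains, constraints):
    """Return an assignment violating the fewest constraints.

    constraints: list of (scope, predicate) where scope is a tuple of
    variable names and predicate tests a combination of their values.
    """
    best = {"violations": len(constraints) + 1, "assignment": None}

    def violations(assignment):
        count = 0
        for scope, pred in constraints:
            if all(v in assignment for v in scope):
                if not pred(*(assignment[v] for v in scope)):
                    count += 1
        return count

    def search(assignment, remaining):
        current = violations(assignment)
        if current >= best["violations"]:   # bound: prune this branch
            return
        if not remaining:
            best["violations"] = current
            best["assignment"] = dict(assignment)
            return
        var, rest = remaining[0], remaining[1:]
        for value in domains[var]:
            assignment[var] = value
            search(assignment, rest)
            del assignment[var]

    search({}, list(variables))
    return best["assignment"], best["violations"]
```

Because violations can only accumulate as the partial assignment grows, the count at any node is a valid lower bound, which is what licenses the pruning step.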
How to improve Bayesian reasoning without instruction: Frequency formats
 Psychological Review
, 1995
"... Is the mind, by design, predisposed against performing Bayesian inference? Previous research on base rate neglect suggests that the mind lacks the appropriate cognitive algorithms. However, any claim against the existence of an algorithm, Bayesian or otherwise, is impossible to evaluate unless one s ..."
Abstract

Cited by 284 (22 self)
Is the mind, by design, predisposed against performing Bayesian inference? Previous research on base rate neglect suggests that the mind lacks the appropriate cognitive algorithms. However, any claim against the existence of an algorithm, Bayesian or otherwise, is impossible to evaluate unless one specifies the information format in which it is designed to operate. The authors show that Bayesian algorithms are computationally simpler in frequency formats than in the probability formats used in previous research. Frequency formats correspond to the sequential way information is acquired in natural sampling, from animal foraging to neural networks. By analyzing several thousand solutions to Bayesian problems, the authors found that when information was presented in frequency formats, statistically naive participants derived up to 50% of all inferences by Bayesian algorithms. Non-Bayesian algorithms included simple versions of Fisherian and Neyman-Pearsonian inference.
Is the mind, by design, predisposed against performing Bayesian inference? The classical probabilists of the Enlightenment, including Condorcet, Poisson, and Laplace, equated probability theory with the common sense of educated people, who were known then as “hommes éclairés.” Laplace (1814/1951) declared that “the theory of probability is at bottom nothing more than good sense reduced to a calculus which evaluates that which good minds know by a sort of instinct,
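The contrast between the two formats can be made concrete with a worked computation. The numbers below are a standard mammography illustration from this literature, assumed here for the example rather than quoted from the abstract:

```python
# Illustrative numbers (a standard mammography example; not necessarily
# the exact figures used in the paper).

# Probability format: Bayes' rule over normalized probabilities.
p_disease = 0.01            # base rate
p_pos_given_disease = 0.80  # hit rate
p_pos_given_healthy = 0.096 # false-alarm rate

posterior = (p_pos_given_disease * p_disease) / (
    p_pos_given_disease * p_disease
    + p_pos_given_healthy * (1 - p_disease)
)

# Frequency format: the same inference over natural frequencies.
# Of 1000 people, 10 have the disease; 8 of them test positive,
# and about 95 of the 990 healthy people also test positive.
posterior_freq = 8 / (8 + 95)

print(round(posterior, 3), round(posterior_freq, 3))  # ~0.078 either way
```

The frequency version reduces to one division over two counts, which is the computational simplification the abstract refers to.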
Operations for Learning with Graphical Models
 Journal of Artificial Intelligence Research
, 1994
"... This paper is a multidisciplinary review of empirical, statistical learning from a graphical model perspective. Wellknown examples of graphical models include Bayesian networks, directed graphs representing a Markov chain, and undirected networks representing a Markov field. These graphical models ..."
Abstract

Cited by 253 (12 self)
This paper is a multidisciplinary review of empirical, statistical learning from a graphical model perspective. Well-known examples of graphical models include Bayesian networks, directed graphs representing a Markov chain, and undirected networks representing a Markov field. These graphical models are extended to model data analysis and empirical learning using the notation of plates. Graphical operations for simplifying and manipulating a problem are provided, including decomposition, differentiation, and the manipulation of probability models from the exponential family. Two standard algorithm schemas for learning are reviewed in a graphical framework: Gibbs sampling and the expectation maximization algorithm. Using these operations and schemas, some popular algorithms can be synthesized from their graphical specification. This includes versions of linear regression, techniques for feedforward networks, and learning Gaussian and discrete Bayesian networks from data. The paper conclu...
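To make the Gibbs sampling schema concrete, here is a minimal sampler for an assumed toy model (a standard bivariate Gaussian); the model and names are illustrative, not drawn from the paper:

```python
# Minimal Gibbs sampling sketch: each variable is resampled from its exact
# conditional given the other. For a standard bivariate Gaussian with
# correlation rho, x|y ~ N(rho*y, 1 - rho^2) and symmetrically for y|x.
import random

def gibbs_bivariate_gaussian(rho, n_samples, burn_in=500):
    x, y = 0.0, 0.0
    samples = []
    cond_sd = (1 - rho * rho) ** 0.5   # sd of each conditional
    for i in range(n_samples + burn_in):
        x = random.gauss(rho * y, cond_sd)   # sample x | y
        y = random.gauss(rho * x, cond_sd)   # sample y | x
        if i >= burn_in:
            samples.append((x, y))
    return samples

pairs = gibbs_bivariate_gaussian(rho=0.8, n_samples=5000)
mean_xy = sum(x * y for x, y in pairs) / len(pairs)
print(round(mean_xy, 2))  # sample estimate of E[xy], close to rho = 0.8
```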
The weighting of evidence and the determinants of confidence
 Cognitive Psychology
, 1992
"... The pattern of overconfidence and underconfidence observed in studies of intuitive judgment is explained by the hypothesis that people focus on the strength or extremeness of the available evidence (e.g., the warmth of a letter or the size of an effect) with insufficient regard for its weight or cr ..."
Abstract

Cited by 167 (2 self)
The pattern of overconfidence and underconfidence observed in studies of intuitive judgment is explained by the hypothesis that people focus on the strength or extremeness of the available evidence (e.g., the warmth of a letter or the size of an effect) with insufficient regard for its weight or credence (e.g., the credibility of the writer or the size of the sample). This mode of judgment yields overconfidence when strength is high and weight is low, and underconfidence when strength is low and weight is high. We first demonstrate this phenomenon in a chance setup where strength is defined by sample proportion and weight is defined by sample size, and then extend the analysis to more complex evidential problems, including general knowledge questions and predicting the behavior of self and of others. We propose that people’s confidence is determined by the balance of arguments for and against the competing hypotheses, with insufficient regard for the weight of the evidence. We show that this account can explain the effect of item difficulty on overconfidence, and we relate the observed discrepancy between confidence judgments and frequency estimates to the illusion of validity.
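The strength-versus-weight distinction can be made concrete with the normative calculation for such a chance setup. The particular coin-bias parameters below are assumed for illustration, not taken from the paper:

```python
# Assumed setup in the spirit of the chance experiment above: a coin is
# equally likely to be biased 60% toward heads or 60% toward tails; we
# observe h heads in n flips and compute the normative confidence that
# it favors heads. Posterior odds reduce to (0.6/0.4)**(h - t).
def confidence_heads(h, n):
    t = n - h
    odds = (0.6 / 0.4) ** (h - t)
    return odds / (1 + odds)

# Same strength (sample proportion 0.75), different weight (sample size):
print(round(confidence_heads(3, 4), 3))    # ~0.692 with n = 4
print(round(confidence_heads(15, 20), 3))  # ~0.983 with n = 20
```

Normatively, confidence should rise sharply with sample size at a fixed proportion; the paper's hypothesis is that people track the proportion (strength) and largely neglect the size (weight).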
Numerical Uncertainty Management in User and Student Modeling: An Overview of Systems and Issues
, 1996
"... . A rapidly growing number of user and student modeling systems have employed numerical techniques for uncertainty management. The three major paradigms are those of Bayesian networks, the DempsterShafer theory of evidence, and fuzzy logic. In this overview, each of the first three main sections fo ..."
Abstract

Cited by 111 (10 self)
A rapidly growing number of user and student modeling systems have employed numerical techniques for uncertainty management. The three major paradigms are those of Bayesian networks, the Dempster-Shafer theory of evidence, and fuzzy logic. In this overview, each of the first three main sections focuses on one of these paradigms. It first introduces the basic concepts by showing how they can be applied to a relatively simple user modeling problem. It then surveys systems that have applied techniques from the paradigm to user or student modeling, characterizing each system within a common framework. The final main section discusses several aspects of the usability of these techniques for user and student modeling, such as their knowledge engineering requirements, their need for computational resources, and the communicability of their results.
Key words: numerical uncertainty management, Bayesian networks, Dempster-Shafer theory, fuzzy logic, user modeling, student modeling
1. Introdu...
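As a concrete taste of one of the three paradigms, the sketch below implements Dempster's rule of combination from Dempster-Shafer theory on an assumed two-hypothesis student modeling example; the scenario and names are illustrative, not the paper's:

```python
# Hedged sketch of Dempster's rule of combination. Mass functions map
# frozensets of hypotheses to belief mass summing to 1.
def combine(m1, m2):
    raw = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                raw[inter] = raw.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2     # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    # Renormalize by the non-conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in raw.items()}

# Two pieces of evidence about whether a student knows a skill:
knows, not_knows = frozenset({"K"}), frozenset({"N"})
either = knows | not_knows
m1 = {knows: 0.6, either: 0.4}                  # 0.4 is uncommitted mass
m2 = {knows: 0.5, not_knows: 0.2, either: 0.3}
print(combine(m1, m2))  # mass on "knows" rises to about 0.77
```

The ability to leave mass uncommitted (on the whole frame) rather than split it between hypotheses is the feature that distinguishes this paradigm from a Bayesian prior.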
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
"... Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision ..."
Abstract

Cited by 95 (18 self)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
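A minimal expected-utility calculation shows the decision-theoretic machinery at its simplest; the umbrella scenario and its numbers are assumed for illustration, not taken from the survey:

```python
# Choosing the action that maximizes expected utility over an uncertain
# state (rain vs. sun), as an influence diagram with one chance node and
# one decision node would encode it.
p_rain = 0.3
utility = {
    ("take_umbrella", "rain"): 70,
    ("take_umbrella", "sun"): 80,
    ("leave_umbrella", "rain"): 0,
    ("leave_umbrella", "sun"): 100,
}

def expected_utility(decision):
    return (p_rain * utility[(decision, "rain")]
            + (1 - p_rain) * utility[(decision, "sun")])

best = max(["take_umbrella", "leave_umbrella"], key=expected_utility)
print(best, expected_utility(best))  # take_umbrella, EU 77.0 (vs. 70.0)
```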
Hierarchical Constraint Logic Programming
, 1993
"... A constraint describes a relation to be maintained ..."
Abstract

Cited by 69 (3 self)
A constraint describes a relation to be maintained
Preference Ratios in Multiattribute Evaluation (PRIME): Elicitation and Decision Procedures Under Incomplete Information
, 2001
"... This paper presents the preference ratios in multiattribute evaluation (PRIME) method which supports the analysis of incomplete information in multiattribute weighting models. In PRIME, preference elicitation and synthesis is based on 1) the conversion of possibly imprecise ratio judgments into an i ..."
Abstract

Cited by 67 (23 self)
This paper presents the Preference Ratios in Multiattribute Evaluation (PRIME) method, which supports the analysis of incomplete information in multiattribute weighting models. In PRIME, preference elicitation and synthesis are based on 1) the conversion of possibly imprecise ratio judgments into an imprecisely specified preference model, 2) the use of dominance structures and decision rules in deriving decision recommendations, and 3) the sequencing of the elicitation process into a series of elicitation tasks. This process may be continued until the most preferred alternative is identified or, alternatively, stopped with a decision recommendation if the decision maker is prepared to accept the possibility that the value of some other alternative is higher. An extensive simulation study on the computational properties of PRIME is presented. The method is illustrated with a reanalysis of an earlier case study on international oil tanker negotiations.
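The dominance idea can be sketched briefly: with interval bounds on attribute weights, one alternative dominates another if its weighted value is at least as high for every feasible weight vector. The corner-checking formulation below is an illustrative simplification, not PRIME's exact procedure:

```python
# Hedged sketch of a dominance check under incomplete weight information.
# Because the weighted value difference is linear in the weights (and
# normalizing by a positive sum never changes its sign), it suffices to
# check every corner of the interval box of weight bounds.
from itertools import product

def dominates(scores_a, scores_b, weight_bounds):
    """scores_*: per-attribute values in [0, 1];
    weight_bounds: (low, high) interval per attribute."""
    for corner in product(*weight_bounds):
        diff = sum(w * (a - b)
                   for w, a, b in zip(corner, scores_a, scores_b))
        if diff < 0:
            return False   # some feasible weighting prefers B
    return True

bounds = [(0.2, 0.5), (0.1, 0.4), (0.2, 0.6)]  # from imprecise judgments
print(dominates([0.9, 0.7, 0.8], [0.6, 0.7, 0.5], bounds))  # True
```

When no alternative dominates, an elicitation process like PRIME's tightens the intervals with further judgments until a recommendation can be made.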
The Role of Aspiration Level in Risky Choice: A Comparison of Cumulative Prospect Theory and SP/A Theory
 Journal of Mathematical Psychology
, 1999
"... In recent years, descriptive models of risky choice have incorporated features that reflect the importance of particular outcome values in choice. Cumulative prospect theory (CPT) does this by inserting a reference point in the utility function. SP/A (securitypotential/aspiration) theory uses aspir ..."
Abstract

Cited by 64 (0 self)
In recent years, descriptive models of risky choice have incorporated features that reflect the importance of particular outcome values in choice. Cumulative prospect theory (CPT) does this by inserting a reference point in the utility function. SP/A (security-potential/aspiration) theory uses aspiration level as a second criterion in the choice process. Experiment 1 compares the ability of the CPT and SP/A models to account for the same within-subjects data set and finds in favor of SP/A. Experiment 2 replicates the main finding of Experiment 1 in a between-subjects design. The final discussion brackets the SP/A result by showing the impact on fit of both decreasing and increasing the number of free
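CPT's reference-point treatment can be shown with the standard Tversky-Kahneman value function; the parameters below are their commonly cited 1992 estimates, assumed here for illustration rather than values fitted in this paper:

```python
# Standard CPT value function: outcomes are coded as gains or losses
# relative to a reference point, with diminishing sensitivity in both
# directions and loss aversion (losses weighted by lam).
def cpt_value(outcome, reference=0.0, alpha=0.88, beta=0.88, lam=2.25):
    x = outcome - reference
    if x >= 0:
        return x ** alpha            # concave for gains
    return -lam * (-x) ** beta       # convex and steeper for losses

print(cpt_value(100), cpt_value(-100))  # a loss looms larger than a gain
```

SP/A theory, by contrast, treats the aspiration level not as a kink in one value function but as a second, separate criterion alongside the security/potential evaluation.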
Adaptive provision of evaluation-oriented information: Tasks and techniques
 Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence
, 1995
"... Evaluationoriented information provision is a function performed by many systems that serve as personal assistants, advisors, or sales assistants. Five general tasks are distinguished which need to be addressed by such systems. For each task, techniques employed in a sample of systems are discussed ..."
Abstract

Cited by 62 (9 self)
Evaluation-oriented information provision is a function performed by many systems that serve as personal assistants, advisors, or sales assistants. Five general tasks are distinguished which need to be addressed by such systems. For each task, techniques employed in a sample of systems are discussed, and it is shown how the lessons learned from these systems can be taken into account with a set of unified techniques that make use of well-understood concepts and principles from Multi-Attribute Utility Theory and Bayesian networks. These techniques are illustrated as realized in the dialog system PRACMA.
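A minimal additive multi-attribute utility score of the kind such systems compute might look as follows; the attributes, weights, and utility functions are assumptions for illustration, since the abstract does not give PRACMA's models:

```python
# Additive MAUT sketch: an item's overall score is a weighted sum of
# single-attribute utilities, each mapped to [0, 1].
def maut_score(item, weights, utilities):
    """weights: attribute -> importance (summing to 1);
    utilities: attribute -> function from raw value to [0, 1]."""
    return sum(w * utilities[attr](item[attr])
               for attr, w in weights.items())

weights = {"price": 0.5, "quality": 0.3, "delivery_days": 0.2}
utilities = {
    "price": lambda p: max(0.0, 1 - p / 1000),        # cheaper is better
    "quality": lambda q: q / 10,                      # rated 0-10
    "delivery_days": lambda d: max(0.0, 1 - d / 30),  # faster is better
}
item = {"price": 400, "quality": 8, "delivery_days": 5}
print(round(maut_score(item, weights, utilities), 3))  # ~0.707
```

In an adaptive system, the weights would be tailored to the individual user's priorities, which is where a user model (e.g., maintained with a Bayesian network) enters.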