Results 1–10 of 63
Reasoning about beliefs and actions under computational resource constraints
 in Proceedings of the Workshop on Uncertainty in Artificial Intelligence
, 1987
Abstract

Cited by 205 (21 self)
Although many investigators affirm a desire to build reasoning systems that behave consistently with the axiomatic basis defined by probability theory and utility theory, limited resources for engineering and computation can make a complete normative analysis impossible. We attempt to move discussion beyond the debate over the scope of problems that can be handled effectively to cases where it is clear that there are insufficient computational resources to perform an analysis deemed as complete. Under these conditions, we stress the importance of considering the expected costs and benefits of applying alternative approximation procedures and heuristics for computation and knowledge acquisition. We discuss how knowledge about the structure of user utility can be used to control value tradeoffs for tailoring inference to alternative contexts. We address the notion of real-time rationality, focusing on the application of knowledge about the expected timewise-refinement abilities of reasoning strategies to balance the benefits of additional computation with the costs of acting with a partial result. We discuss the benefits of applying decision theory to control the solution of difficult problems given limitations and uncertainty in reasoning resources.
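The real-time rationality idea in this abstract, continuing to compute only while the expected benefit of further refinement exceeds the cost of delay, can be sketched as follows. This is a minimal illustration, not the paper's procedure: the refinement profile and the delay cost are invented values.

```python
# Hypothetical sketch: an anytime algorithm is refined step by step, and
# deliberation stops once the expected quality gain from one more step no
# longer exceeds the cost of acting later. All numbers are illustrative.

def refine(quality):
    """One refinement step: assume each step halves the remaining error."""
    return quality + 0.5 * (1.0 - quality)

def deliberate(initial_quality=0.0, delay_cost_per_step=0.1):
    quality, steps = initial_quality, 0
    while True:
        expected_gain = 0.5 * (1.0 - quality)  # known timewise-refinement profile
        if expected_gain <= delay_cost_per_step:
            break  # further computation costs more than it is worth
        quality, steps = refine(quality), steps + 1
    return quality, steps

q, n = deliberate()  # stops after a few steps with a partial but "good enough" result
```

Under these assumptions the loop halts after three steps, once the marginal gain (0.0625) falls below the per-step delay cost (0.1).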
Decision Theory in Expert Systems and Artificial Intelligence
 International Journal of Approximate Reasoning
, 1988
Abstract

Cited by 100 (19 self)
Despite their different perspectives, artificial intelligence (AI) and the disciplines of decision science have common roots and strive for similar goals. This paper surveys the potential for addressing problems in representation, inference, knowledge engineering, and explanation within the decision-theoretic framework. Recent analyses of the restrictions of several traditional AI reasoning techniques, coupled with the development of more tractable and expressive decision-theoretic representation and inference strategies, have stimulated renewed interest in decision theory and decision analysis. We describe early experience with simple probabilistic schemes for automated reasoning, review the dominant expert-system paradigm, and survey some recent research at the crossroads of AI and decision science. In particular, we present the belief network and influence diagram representations. Finally, we discuss issues that have not been studied in detail within the expert-systems sett...
Toward normative expert systems: Part I. The Pathfinder project
 Methods of Information in Medicine, 31:90–105
, 1992
From Laplace to Supernova SN 1987A: Bayesian Inference in Astrophysics
, 1990
Abstract

Cited by 61 (2 self)
The Bayesian approach to probability theory is presented as an alternative to the currently used long-run relative frequency approach, which does not offer clear, compelling criteria for the design of statistical methods. Bayesian probability theory offers unique and demonstrably optimal solutions to well-posed statistical problems, and is historically the original approach to statistics. The reasons for earlier rejection of Bayesian methods are discussed, and it is noted that the work of Cox, Jaynes, and others answers earlier objections, giving Bayesian inference a firm logical and mathematical foundation as the correct mathematical language for quantifying uncertainty. The Bayesian approaches to parameter estimation and model comparison are outlined and illustrated by application to a simple problem based on the Gaussian distribution. As further illustrations of the Bayesian paradigm, Bayesian solutions to two interesting astrophysical problems are outlined: the measurement of wea...
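The "simple problem based on the Gaussian distribution" mentioned above can be sketched with the standard conjugate update for a Gaussian mean: with a Gaussian prior and known noise variance, the posterior is again Gaussian with a precision-weighted mean. The numbers below are invented for illustration and are not from the paper.

```python
# Sketch of Bayesian parameter estimation for a Gaussian mean with known
# noise standard deviation sigma and a Gaussian prior N(prior_mu, prior_sigma^2).
# The posterior precision is the sum of prior and data precisions, and the
# posterior mean is the precision-weighted average of prior mean and sample mean.
import math

def posterior_mean_gaussian(data, sigma, prior_mu, prior_sigma):
    n = len(data)
    prior_prec = 1.0 / prior_sigma**2          # prior precision
    data_prec = n / sigma**2                   # likelihood precision of the mean
    post_prec = prior_prec + data_prec
    sample_mean = sum(data) / n
    post_mu = (prior_prec * prior_mu + data_prec * sample_mean) / post_prec
    return post_mu, math.sqrt(1.0 / post_prec)

# With a broad prior, the posterior mean sits very close to the sample mean.
mu, sd = posterior_mean_gaussian([4.9, 5.1, 5.0], sigma=0.5,
                                 prior_mu=0.0, prior_sigma=10.0)
```

With the broad prior above, the posterior mean is pulled only negligibly toward the prior, as expected when the data dominate.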
Imprecision in Engineering Design
 ASME JOURNAL OF MECHANICAL DESIGN
, 1995
Abstract

Cited by 56 (6 self)
Methods for incorporating imprecision in engineering design decision-making are briefly reviewed and compared. A tutorial is presented on the Method of Imprecision (MoI), a formal method, based on the mathematics of fuzzy sets, for representing and manipulating imprecision in engineering design. The results of a design cost estimation example, utilizing a new informal cost specification, are presented. The MoI can provide formal information upon which to base decisions during preliminary engineering design and can facilitate set-based concurrent design.
A decision theoretic framework for approximating concepts
 International Journal of Man-Machine Studies
, 1992
Abstract

Cited by 41 (22 self)
This paper explores the implications of approximating a concept based on the Bayesian decision procedure, which provides a plausible unification of the fuzzy set and rough set approaches for approximating a concept. We show that if a given concept is approximated by one set, the same result given by the α-cut in fuzzy set theory is obtained. On the other hand, if a given concept is approximated by two sets, we can derive both the algebraic and probabilistic rough set approximations. Moreover, based on the well-known principle of maximum (minimum) entropy, we give a useful interpretation of fuzzy intersection and union. Our results enhance the understanding and broaden the applications of both fuzzy and rough sets.
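The one-set versus two-set approximations described above can be illustrated concretely: an α-cut approximates a fuzzy concept by a single crisp set, while a pair of thresholds yields rough-set-style lower and upper approximations. The membership values and thresholds below are invented for illustration.

```python
# Hypothetical example: a fuzzy concept over four elements, approximated
# either by one set (alpha-cut) or by two sets (lower/upper approximations).
membership = {"a": 0.9, "b": 0.6, "c": 0.3, "d": 0.1}

def alpha_cut(mu, alpha):
    """Single-set approximation: elements with membership >= alpha."""
    return {x for x, m in mu.items() if m >= alpha}

def rough_approximation(mu, lower_t, upper_t):
    """Two-set approximation: a certain region and a possible region."""
    lower = {x for x, m in mu.items() if m >= upper_t}  # surely in the concept
    upper = {x for x, m in mu.items() if m > lower_t}   # possibly in the concept
    return lower, upper

cut = alpha_cut(membership, 0.5)                      # {'a', 'b'}
low, up = rough_approximation(membership, 0.2, 0.8)   # {'a'} and {'a', 'b', 'c'}
```

Elements in `up` but not in `low` form the boundary region, the zone of genuine indecision that the two-set approximation makes explicit and the single α-cut cannot.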
Aggregation Functions for Engineering Design Tradeoffs
, 1998
Abstract

Cited by 33 (14 self)
The choice of an aggregation function is a common problem in Multi Attribute Decision Making (MADM) systems. The Method of Imprecision (MoI) is a formal theory for the manipulation of preliminary design information that represents preferences among design alternatives with the mathematics of fuzzy sets. The MoI formulates the preliminary design problem as a MADM problem. To date, two aggregation functions have been developed for the MoI, one representing a compensating strategy and one a noncompensating strategy. Much of the prior fuzzy sets research on aggregation functions has been inappropriate for application to engineering design. In this paper, the selection of an aggregation function for MADM schemes is discussed within the context of the MoI. The general restrictions on design-appropriate aggregation functions are outlined, and a family of functions, modeling a range of tradeoff strategies, is presented. The results are illustrated with an example.
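One standard family that models "a range of tradeoff strategies" is the weighted power mean, which interpolates between a noncompensating (worst-case) strategy and a compensating (averaging) one. This is a generic sketch, not necessarily the family developed for the MoI, and the preference values are invented.

```python
# Weighted power mean over attribute preferences in (0, 1]. The exponent s
# sets the tradeoff strategy: s = 1 gives the arithmetic mean (fully
# compensating), while s -> -infinity approaches the min (noncompensating).

def power_mean(prefs, weights, s):
    total = sum(weights)
    return sum(w / total * p**s for w, p in zip(weights, prefs)) ** (1.0 / s)

prefs, weights = [0.9, 0.2], [1.0, 1.0]   # one strong attribute, one weak
compensating = power_mean(prefs, weights, 1)       # ~0.55: the 0.9 offsets the 0.2
near_worst_case = power_mean(prefs, weights, -20)  # ~0.21: the weak attribute dominates
```

Sweeping `s` between these extremes exposes the design question the paper raises: how much should strength in one attribute be allowed to compensate for weakness in another?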
Formalisms for Negotiation in Engineering Design
, 1996
Abstract

Cited by 26 (5 self)
Engineering projects often undergo several design iterations before being completed. Information received from other groups working on a project (analysis, manufacturing, marketing, sales) will often necessitate changes in a design. The interaction between different groups associated with a design project often takes the form of informal “negotiation.” This form of interaction commonly arises when engineering information is imprecise. The Method of Imprecision (MoI) is a formal method for the representation and manipulation of preliminary and imprecise design information. It provides a mechanism for the formalization of these informal negotiations. The nature and scope of informal negotiation in engineering is explored and discussed, and application of the MoI is illustrated with an example.
The method of imprecision compared to utility theory for design selection problems
 In Proceedings of the 1993 ASME Design Theory and Methodology Conference
, 1993
Abstract

Cited by 24 (5 self)
Two methods have been proposed for manipulating uncertainty reflecting designer choice: utility theory and the method of imprecision. Both methods represent this uncertainty across decision-making attributes with zero-to-one ranks, with higher preference modeled by a higher rank. The two methods can differ, however, in the combination metrics used to combine the ranks of the incommensurate design attributes. Utility theory resolves the multiple attributes using various well-proven additive metrics. In contrast, the method of imprecision also considers nonadditive metrics, such as ranking by the worst-case performance or by multiplicative metrics. The axioms of utility theory are appropriate for designs where it is deemed the attributes can always be traded off, even to the point of achieving zero preference in some attributes. In the case of a design with attributes that cannot have zero preference, such as stress limits or maximum allowed cost, the method of imprecision is more appropriate: it trades off attribute levels without permitting any of them to be traded off to zero performance.
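The key behavioral difference in this abstract, whether a zero-preference attribute can be traded away, shows up directly in the combination metrics. The sketch below uses invented preference values and generic additive and worst-case metrics, not the papers' exact formulations.

```python
# An additive (utility-style) metric lets strong attributes offset a
# zero-preference attribute, while a worst-case (min) metric treats any
# zero preference, e.g. an exceeded stress limit, as an outright veto.

def additive(prefs, weights):
    return sum(w * p for w, p in zip(weights, prefs))

def worst_case(prefs):
    return min(prefs)

design = [0.0, 1.0, 1.0]  # first attribute (say, stress) is unacceptable

score_additive = additive(design, [1/3, 1/3, 1/3])  # ~0.67: still looks acceptable
score_worst = worst_case(design)                    # 0.0: design rejected outright
```

The min metric thus encodes the non-tradeable attributes the abstract describes: no amount of performance elsewhere rescues a design that violates a hard limit.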
Bayesian Regularisation and Pruning using a Laplace Prior
 Neural Computation
, 1994
Abstract

Cited by 23 (0 self)
Standard techniques for improved generalisation from neural networks include weight decay and pruning. Weight decay has a Bayesian interpretation with the decay function corresponding to a prior over weights. The method of transformation groups and maximum entropy indicates a Laplace rather than a Gaussian prior. After training, the weights then arrange themselves into two classes: (1) those with a common sensitivity to the data error, and (2) those failing to achieve this sensitivity, which therefore vanish. Since the critical value is determined adaptively during training, pruning, in the sense of setting weights to exactly zero, becomes a consequence of regularisation alone. The count of free parameters is also reduced automatically as weights are pruned. A comparison is made with results of MacKay using the evidence framework and a Gaussian regulariser.
1 Introduction
Neural networks designed for regression or classification need to be trained using some form of stabilisation or re...
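The mechanism by which a Laplace prior prunes weights can be sketched with a single proximal (soft-threshold) update step; this is an illustration of the general L1-versus-L2 behavior, not the paper's training procedure, and all parameter values are invented.

```python
# A Laplace prior corresponds to an L1 penalty, whose proximal update can
# set a weight exactly to zero; a Gaussian prior (L2 penalty, i.e. ordinary
# weight decay) only shrinks weights multiplicatively and never zeroes them.

def l1_step(w, grad, lr, lam):
    """Gradient step followed by soft-thresholding (Laplace prior)."""
    w = w - lr * grad
    if abs(w) <= lr * lam:
        return 0.0                                  # pruned: exactly zero
    return w - lr * lam * (1 if w > 0 else -1)      # shrunk toward zero

def l2_step(w, grad, lr, lam):
    """Gradient step with multiplicative weight decay (Gaussian prior)."""
    return (w - lr * grad) * (1 - lr * lam)

w_l1 = l1_step(0.005, 0.0, lr=0.1, lam=0.1)  # small weight eliminated outright
w_l2 = l2_step(0.005, 0.0, lr=0.1, lam=0.1)  # merely shrunk, stays nonzero
```

This is the sense in which pruning "becomes a consequence of regularisation alone": weights below the adaptive threshold are driven to exact zero by the update itself, with no separate pruning pass.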