Results 1–10 of 20
Probabilistic argumentation systems
 Handbook of Defeasible Reasoning and Uncertainty Management Systems, Volume 5: Algorithms for Uncertainty and Defeasible Reasoning
, 2000
Abstract

Cited by 64 (35 self)
Different formalisms for solving problems of inference under uncertainty have been developed so far. The most popular numerical approach is the theory of Bayesian inference [42]. More general approaches are the Dempster-Shafer theory of evidence [51] and possibility theory [16], which is closely related to fuzzy systems.
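The Dempster-Shafer framework mentioned here can be sketched in a few lines. This is a minimal illustration, not code from the cited work; the fault-diagnosis frame and the mass values are invented for the example. Unlike a Bayesian prior, a mass function may assign mass to *sets* of hypotheses, expressing ignorance directly:

```python
# Hypothetical frame of discernment for a fault-diagnosis task.
frame = frozenset({"motor", "sensor", "wiring"})
mass = {
    frozenset({"motor"}): 0.5,
    frozenset({"motor", "sensor"}): 0.3,   # evidence that cannot discriminate
    frame: 0.2,                            # total-ignorance component
}

def belief(A, mass):
    """Bel(A) = sum of the masses of all focal sets contained in A."""
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(A, mass):
    """Pl(A) = sum of the masses of all focal sets intersecting A."""
    return sum(m for B, m in mass.items() if B & A)

A = frozenset({"motor"})
print(belief(A, mass), plausibility(A, mass))  # 0.5 1.0
```

The gap between Bel(A) and Pl(A) is what distinguishes this from a single Bayesian probability: here the evidence pins "motor" between 0.5 and 1.0 rather than at a point value.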
Proactive algorithms for job shop scheduling with probabilistic durations
 Journal of Artificial Intelligence Research
Abstract

Cited by 24 (2 self)
Most classical scheduling formulations assume a fixed and known duration for each activity. In this paper, we weaken this assumption, requiring instead that each duration can be represented by an independent random variable with a known mean and variance. The best solutions are ones which have a high probability of achieving a good makespan. We first create a theoretical framework, formally showing how Monte Carlo simulation can be combined with deterministic scheduling algorithms to solve this problem. We propose an associated deterministic scheduling problem whose solution is proved, under certain conditions, to be a lower bound for the probabilistic problem. We then propose and investigate a number of techniques for solving such problems based on combinations of Monte Carlo simulation, solutions to the associated deterministic problem, and either constraint programming or tabu search. Our empirical results demonstrate that a combination of the use of the associated deterministic problem and Monte Carlo simulation results in algorithms that scale best both in terms of problem size and uncertainty. Further experiments point to the correlation between the quality of the deterministic solution and the quality of the probabilistic solution as a major factor responsible for this success.
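The Monte Carlo evaluation step described in this abstract can be sketched as follows. This is an illustrative toy, not the authors' algorithm: the four-activity instance, the truncated-normal duration model, and all names are assumptions made for the example. Given a fixed precedence graph (jobs plus an already-decided resource ordering), each sample draws durations and computes the resulting makespan; the fraction of samples meeting a deadline estimates the solution's probabilistic guarantee.

```python
import random

# Precedence graph: activity -> list of predecessors. Two jobs of two
# activities each, with the resource ordering "A2 before B1" already fixed.
preds = {"A1": [], "A2": ["A1"], "B1": ["A2"], "B2": ["B1"]}
params = {"A1": (4, 1), "A2": (3, 1), "B1": (5, 2), "B2": (2, 1)}  # (mean, std)

def sample_makespan(rng):
    """Draw one duration per activity and compute the makespan."""
    finish = {}
    for act in ["A1", "A2", "B1", "B2"]:  # topological order of the graph
        start = max((finish[p] for p in preds[act]), default=0.0)
        mean, std = params[act]
        finish[act] = start + max(0.0, rng.gauss(mean, std))  # truncate at 0
    return max(finish.values())

def prob_makespan_at_most(deadline, n=10_000, seed=0):
    """Monte Carlo estimate of P(makespan <= deadline) for the fixed schedule."""
    rng = random.Random(seed)
    return sum(sample_makespan(rng) <= deadline for _ in range(n)) / n
```

With deterministic (mean) durations this chain has makespan 14, so `prob_makespan_at_most(14)` lands near 0.5, while generous deadlines approach 1.0.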
Coarsening Approximations of Belief Functions
 In Proc. of ECSQARU'2001 (to appear)
, 2001
Abstract

Cited by 19 (0 self)
A method is proposed for reducing the size of a frame of discernment in such a way that the loss of information content in a set of belief functions is minimized. This approach allows one to compute strong inner and outer approximations which can be combined efficiently using the Fast Möbius Transform algorithm.
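The Fast Möbius Transform underlying this approach can be sketched as follows (a minimal illustration over a bitmask-coded frame, assuming nothing about the authors' implementation): with focal sets coded as n-bit masks, belief is obtained from mass, and mass recovered from belief, in O(n·2^n) steps instead of the naive O(4^n) subset double loop.

```python
def mass_to_belief(m, n):
    """Zeta transform: bel[A] = sum of m[B] over all subsets B of A.
    m is a list of length 2**n indexed by bitmask; n is the frame size."""
    bel = list(m)
    for i in range(n):
        for A in range(1 << n):
            if A & (1 << i):
                bel[A] += bel[A ^ (1 << i)]
    return bel

def belief_to_mass(bel, n):
    """Inverse (Möbius) transform, recovering the mass function from belief."""
    m = list(bel)
    for i in range(n):
        for A in range(1 << n):
            if A & (1 << i):
                m[A] -= m[A ^ (1 << i)]
    return m
```

For example, on a 3-element frame with mass 0.6 on element {0} (mask `0b001`) and 0.4 on the full frame (`0b111`), `mass_to_belief` gives Bel(`0b001`) = 0.6 and Bel(`0b111`) = 1.0, and `belief_to_mass` recovers the original masses exactly.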
Implementing general belief function framework with a practical codification for low complexity
 in Advances and Applications of DSmT for Information Fusion
, 2009
Abstract

Cited by 10 (0 self)
In this chapter, we propose a new practical codification of the elements of the Venn diagram in order to manipulate the focal elements easily. To reduce the complexity, any constraints must be integrated into the codification from the start. Hence, we only consider a reduced hyper power set D^Θ_r, which can be 2^Θ or D^Θ. We describe all the steps of a general belief function framework. The decision step is studied in particular detail: when we can decide on intersections of the singletons of the discernment space, no existing decision functions are easy to use. Hence, two approaches are proposed: an extension of a previous one, and an approach based on the specificity of the elements on which to decide. The principal goal of this chapter is to provide practical code for a general belief function framework for researchers and users who need belief function theory.
Job Shop Scheduling with Probabilistic Durations
Abstract

Cited by 10 (2 self)
Proactive approaches to scheduling take into account information about the execution time uncertainty in forming a schedule. In this paper, we investigate proactive approaches for the job shop scheduling problem where activity durations are random variables. The main contributions are (i) the introduction of the problem of finding probabilistic execution guarantees for difficult scheduling problems; (ii) a method for generating a lower bound on the minimal makespan; (iii) the development of the Monte Carlo approach for evaluating solutions; and (iv) the design and empirical analysis of three solution techniques: an approximately complete technique, found to be computationally feasible only for very small problems, and two techniques based on finding good solutions to a deterministic scheduling problem, which scale to much larger problems.
Inner And Outer Approximation Of Belief Structures Using A Hierarchical Clustering Approach
, 2001
Extended K-Nearest Neighbours Based on Evidence Theory
 The Computer Journal
, 2003
Abstract

Cited by 7 (0 self)
An evidence theoretic classification method is proposed in this paper. In order to classify a pattern we consider its neighbours, which are taken as parts of a single source of evidence to support the class membership of the pattern. A single mass function or basic belief assignment is then derived, and the belief function and the pignistic ("betting rates") probability function can be calculated. Then the (posterior) conditional pignistic probability function is calculated and used to decide the class label for the pattern.
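The pignistic ("betting rates") transform used in this method can be sketched in a few lines; the animal-classification frame and mass values below are invented for illustration. Each focal set's mass is redistributed uniformly over its elements, yielding an ordinary probability distribution from which the class label can be picked:

```python
def pignistic(mass):
    """Redistribute each focal set's mass uniformly over its elements.
    mass maps frozenset focal elements to their basic belief masses."""
    betp = {}
    for focal, m in mass.items():
        share = m / len(focal)
        for elem in focal:
            betp[elem] = betp.get(elem, 0.0) + share
    return betp

# Illustrative evidence from the neighbours of a pattern:
mass = {
    frozenset({"cat"}): 0.5,
    frozenset({"cat", "dog"}): 0.3,
    frozenset({"cat", "dog", "bird"}): 0.2,
}
betp = pignistic(mass)
# BetP(cat) = 0.5 + 0.3/2 + 0.2/3 ≈ 0.717, so "cat" is the decided label.
```

The resulting values always sum to 1, so the usual argmax decision rule applies directly.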
Approximating the Combination of Belief Functions Using the Fast Möbius Transform in a Coarsened Frame
, 2002
Abstract

Cited by 3 (2 self)
A method is proposed for reducing the size of a frame of discernment in such a way that the loss of information content in a set of belief functions is minimized. This method may be seen as a hierarchical clustering procedure applied to the columns of a binary data matrix, using a particular dissimilarity measure. It allows one to compute approximations of the mass functions, which can be combined efficiently in the coarsened frame using the Fast Möbius Transform algorithm, yielding inner and outer approximations of the combined belief function.
Multimodal Belief Fusion for Face and Ear Biometrics
Abstract

Cited by 1 (0 self)
Abstract: This paper proposes a multimodal biometric system based on a Gaussian Mixture Model (GMM) for face and ear biometrics, with belief fusion of the estimated scores characterized by Gabor responses; the fusion is accomplished by Dempster-Shafer (DS) decision theory. Face and ear images are convolved with Gabor wavelet filters to extract spatially enhanced Gabor facial features and Gabor ear features. A GMM is then applied to the high-dimensional Gabor face and Gabor ear responses separately for quantitative measurements, with the Expectation Maximization (EM) algorithm used to estimate the density parameters. This produces two sets of feature vectors, which are then fused using Dempster-Shafer theory. Experiments are conducted on two multimodal databases: the IIT Kanpur database, containing face and ear images of 400 individuals, and a virtual database consisting of images of 17 subjects taken from the BANCA face database and the TUM ear database. It is found that the use of Gabor wavelet filters along with a GMM and DS theory provides a robust and efficient multimodal fusion strategy.
Modelling Uncertainty in Agent Programming
 In: Proceedings of the Third International Workshop on Declarative Agent Languages and Technologies III (DALT '05), volume 3904 of LNCS
, 2005
Abstract

Cited by 1 (0 self)
Existing cognitive agent programming languages based on the BDI model employ logical representation and reasoning for implementing the beliefs of agents. In these programming languages, beliefs are assumed to be certain, i.e. an implemented agent either believes a proposition or does not. These languages fail to capture the underlying uncertainty of an agent's beliefs, which is essential for many real-world agent applications. We introduce Dempster-Shafer theory as a convenient method to model uncertainty in an agent's beliefs. We show that, using simple support functions as a representation for the agent's beliefs, the computational complexity of Dempster's Rule of Combination can be controlled. In particular, the certainty value of a proposition can be deduced from the beliefs of agents without having to calculate the combination of Dempster-Shafer mass functions.
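The combination rule this abstract refers to can be sketched as follows; the proposition names and mass values are illustrative, and this is the textbook rule rather than the paper's optimised procedure. A simple support function puts mass s on one set A and 1-s on the whole frame; combining two such functions on the same A under Dempster's rule involves no conflict, so the combined support reduces to 1 - (1-s1)(1-s2), which is the shortcut the abstract exploits.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets; conflict mass is renormalised away."""
    combined, conflict = {}, 0.0
    for B, mb in m1.items():
        for C, mc in m2.items():
            inter = B & C
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc
    k = 1.0 - conflict
    return {A: v / k for A, v in combined.items()}

theta = frozenset({"p", "not_p"})        # the whole frame
A = frozenset({"p"})
m1 = {A: 0.6, theta: 0.4}                # simple support function, s1 = 0.6
m2 = {A: 0.7, theta: 0.3}                # simple support function, s2 = 0.7
m12 = dempster_combine(m1, m2)
# m12[A] == 1 - (1 - 0.6) * (1 - 0.7) == 0.88, computed without conflict.
```

The same `dempster_combine` handles conflicting sources too, at the cost of the quadratic pass over focal elements that the simple-support shortcut avoids.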