Results 1–10 of 23
Target Identification Based on the Transferable Belief Model Interpretation of Dempster-Shafer Model. Part I: Methodology
, 2001
"... This paper explains how multisensor data fusion and target identification can be performed within the transferable belief model, a model for the representation of quantified uncertainty based on belief functions. The paper is presented in two parts: methodology and application. In this part, we pres ..."
Abstract

Cited by 21 (3 self)
 Add to MetaCart
This paper explains how multisensor data fusion and target identification can be performed within the transferable belief model, a model for the representation of quantified uncertainty based on belief functions. The paper is presented in two parts: methodology and application. In this part, we present the underlying theory, in particular the General Bayesian Theorem needed to transform likelihoods into beliefs and the pignistic transformation needed to build the probability measure required for decision making. We end with a simple example. More sophisticated examples and some comparative studies are presented in Part II. The results presented here can be extended directly to many problems of data fusion and diagnosis.
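The pignistic transformation mentioned above has a compact closed form: each focal set's mass is split evenly among its elements, after renormalizing away any mass on the empty set. A minimal sketch (the frame elements and mass values below are illustrative, not taken from the paper):

```python
def pignistic(masses):
    """Pignistic transformation BetP: split each focal set's mass m(A)
    evenly over its elements, renormalizing by 1 - m(emptyset)."""
    empty = masses.get(frozenset(), 0.0)
    betp = {}
    for focal, m in masses.items():
        if not focal:
            continue
        share = m / (len(focal) * (1.0 - empty))
        for omega in focal:
            betp[omega] = betp.get(omega, 0.0) + share
    return betp

# Illustrative masses on the frame {a, b, c}
m = {frozenset({"a"}): 0.5,
     frozenset({"a", "b"}): 0.3,
     frozenset({"a", "b", "c"}): 0.2}
betp = pignistic(m)  # a probability distribution, ready for decision making
```

The result is a genuine probability measure, which is exactly what the decision-making step of the model requires.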
Coarsening Approximations of Belief Functions
 In Proc. of ECSQARU'2001 (to appear)
, 2001
"... A method is proposed for reducing the size of a frame of discernment, in such a way that the loss of information content in a set of belief functions is minimized. This approach allows to compute strong inner and outer approximations which can be combined efficiently using the Fast Möbius Transform ..."
Abstract

Cited by 20 (0 self)
 Add to MetaCart
A method is proposed for reducing the size of a frame of discernment in such a way that the loss of information content in a set of belief functions is minimized. This approach makes it possible to compute strong inner and outer approximations which can be combined efficiently using the Fast Möbius Transform algorithm.
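At its core, the Fast Möbius Transform used for efficient combination is a subset-sum (zeta) transform over bitmask-encoded subsets, running in n·2^n steps rather than the naive 4^n of summing over all subset pairs. A sketch of that transform alone, assuming a bitmask encoding (the coarsening method itself is not reproduced here):

```python
def fmt_belief(m, n):
    """Fast Moebius Transform, subset-sum form: given masses m indexed by
    bitmask subsets of an n-element frame, return bel with
    bel[A] = sum of m[B] over all B that are subsets of A."""
    bel = list(m)
    for i in range(n):
        bit = 1 << i
        for a in range(1 << n):
            if a & bit:
                bel[a] += bel[a ^ bit]
    return bel

# Frame of 2 elements; index 0b11 is the whole frame
bel = fmt_belief([0.0, 0.5, 0.3, 0.2], 2)
```

The same in-place sweep, with the update direction reversed, inverts the transform, which is why it is the workhorse for converting between mass, belief, and commonality representations.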
Matrix Calculus for Belief Functions
, 2001
"... The mathematic of belief functions can be handled by the use of the matrix notation. This representation helps greatly the user thanks to its notational simplicity and its efficiency for proving theorems. We show how to use them for several problems related to belief functions and the transferable b ..."
Abstract

Cited by 19 (0 self)
 Add to MetaCart
The mathematics of belief functions can be handled using matrix notation. This representation greatly helps the user thanks to its notational simplicity and its efficiency for proving theorems. We show how to use matrices for several problems related to belief functions and the transferable belief model.
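One way to see the matrix view: belief is obtained from mass by multiplying with a fixed 0/1 inclusion matrix. A small illustrative sketch (the bitmask encoding and the example masses are our assumptions, not the paper's notation):

```python
import numpy as np

def inclusion_matrix(n):
    """2^n x 2^n 0/1 matrix M with M[A, B] = 1 iff B is a subset of A,
    so that bel = M @ m in a single matrix-vector product."""
    size = 1 << n
    M = np.zeros((size, size))
    for a in range(size):
        for b in range(size):
            if b & ~a == 0:  # every bit of B is also set in A
                M[a, b] = 1.0
    return M

m = np.array([0.0, 0.5, 0.3, 0.2])  # illustrative masses, bitmask-indexed
bel = inclusion_matrix(2) @ m
```

Inverting the matrix recovers mass from belief, so many identities about belief functions reduce to routine linear algebra.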
Data Association in Multi-Target Detection Using the Transferable Belief Model.
 International Journal of Intelligent Systems
, 2000
"... In the transferable belief model, a model for the quantified representation of beliefs, some masses can be allocated to the empty set. It reflects the conflict between the sources of information. This quantified conflict can be used in order to solve the problem of data association in a multitarg ..."
Abstract

Cited by 18 (0 self)
 Add to MetaCart
In the transferable belief model, a model for the quantified representation of beliefs, some masses can be allocated to the empty set. This mass reflects the conflict between the sources of information. This quantified conflict can be used to solve the problem of data association in a multi-target detection problem. We present and illustrate the procedure by studying an example based on the detection of submarines. Their number and the association of each sensor with a particular source are determined by the procedure.
Keywords: transferable belief model, belief functions, data association, Dempster-Shafer theory, conflict of beliefs, fusion.
1 Introduction. Multisensor data fusion is the data processing function that combines data collected from systems comprising several sensors. These multisensor systems are characterized by the following features that must be taken into account: the different sensors observe the same scene, or at least partially (overlapping fields ...
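The conflict mass the abstract exploits arises from the unnormalized conjunctive combination: products of masses whose focal sets have empty intersection accumulate on the empty set. An illustrative sketch with made-up sensor masses:

```python
def conjunctive_combine(m1, m2):
    """Unnormalized conjunctive combination of two mass functions;
    the mass landing on the empty set measures their conflict."""
    out = {}
    for a, va in m1.items():
        for b, vb in m2.items():
            inter = a & b
            out[inter] = out.get(inter, 0.0) + va * vb
    return out

# Two hypothetical sensors favoring disjoint targets
m1 = {frozenset({"t1"}): 0.8, frozenset({"t1", "t2"}): 0.2}
m2 = {frozenset({"t2"}): 0.9, frozenset({"t1", "t2"}): 0.1}
combined = conjunctive_combine(m1, m2)
conflict = combined.get(frozenset(), 0.0)  # 0.8 * 0.9 = 0.72: a poor association
```

Comparing this conflict across candidate sensor-to-source assignments is the essence of the association criterion: low conflict suggests the readings describe the same target.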
Partial ordering of hyperpowersets and matrix representation of belief functions within DSmT
, 2003
"... ..."
Revising Beliefs Received from Multiple Sources
, 1999
"... : Since the seminal, philosophical and influential works of Alchourr'on, Gardenfors and Makinson, ideas on "belief revision" have been progressively refined toward normative, effective and computable paradigms. Side by side to this "symbolic" line of research, there has been also a "numerical" appro ..."
Abstract

Cited by 15 (4 self)
 Add to MetaCart
Since the seminal, philosophical and influential works of Alchourrón, Gärdenfors and Makinson, ideas on "belief revision" have been progressively refined toward normative, effective and computable paradigms. Alongside this "symbolic" line of research, there has also been a "numerical" approach to belief revision whose main contributions were the probabilistic and the evidence-based approaches. The opinion expressed in this paper is that, to be applied in a multi-source environment, belief revision has to depart considerably from the original framework. In particular, it has to abandon the fundamental principle of "Priority to the Incoming Information" in favor of what we call the principle of "Recoverability". Furthermore, the semantic approach should be blended with a syntactic treatment of consistency inspired by Truth Maintenance Systems.
1 A Brief Retrospective. 1.1 The AGM paradigm. During the last decade, the logical framework laid down by Alchourrón, Gärdenf...
Theory of evidence: a survey of its mathematical foundations, applications and computational aspects
 ZOR - Mathematical Methods of Operations Research
, 1994
"... The mathematical theory of evidence has been introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented under several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur ..."
Abstract

Cited by 14 (0 self)
 Add to MetaCart
The mathematical theory of evidence was introduced by Glenn Shafer in 1976 as a new approach to the representation of uncertainty. This theory can be represented under several distinct but more or less equivalent forms. Probabilistic interpretations of evidence theory have their roots in Arthur Dempster's multivalued mappings of probability spaces. This leads to random set and, more generally, to random filter models of evidence. In this probabilistic view, evidence is seen as more or less probable arguments for certain hypotheses, and these arguments can be used to support those hypotheses to certain degrees. These degrees of support are in fact the reliabilities with which the hypotheses can be derived from the evidence. Alternatively, the mathematical theory of evidence can be founded axiomatically on the notion of belief functions or on the allocation of belief masses to subsets of a frame of discernment. These approaches aim to present evidence theory as an extension of probability theory. Evidence theory has been used to represent uncertainty in expert systems, especially in the domain of diagnostics. It can be applied to decision analysis and it gives a new perspective for statistical analysis. Among its further applications are image processing, project planning and scheduling, and risk analysis. The computational problems of evidence theory ...
Monte-Carlo methods make Dempster-Shafer formalism feasible
 In
, 1994
"... Abstract: One of the main obstacles to the applications of DempsterShafer formalism is its computational complexity. If we combine rn different pieces of knowledge, then in general case we have to perform up to 2 m computational steps, which for large m is infeasible. For several important cases al ..."
Abstract

Cited by 8 (3 self)
 Add to MetaCart
One of the main obstacles to the applications of the Dempster-Shafer formalism is its computational complexity. If we combine m different pieces of knowledge, then in the general case we have to perform up to 2^m computational steps, which for large m is infeasible. For several important cases, algorithms with smaller running time have been proposed. We prove, however, that if we want to compute the belief bel(Q) in any given query Q, then exponential time is inevitable. It is still inevitable if we want to compute bel(Q) with a given precision ε. This restriction corresponds to the natural idea that since initial masses are known only approximately, there is no sense in trying to compute bel(Q) precisely. A further idea is that there is always some doubt in the whole knowledge, so there is always a probability P0 that the expert's knowledge is wrong. In view of that, it is sufficient to have an algorithm that gives a correct answer with probability > 1 − P0. If we use the original Dempster's combination rule, this possibility diminishes the running time, but still leaves the problem infeasible in the general case. We show that for the alternative combination rules proposed by Smets and Yager ...
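The Monte-Carlo idea can be sketched as rejection sampling for Dempster's rule: draw one focal set per source, discard trials whose overall intersection is empty, and estimate bel(Q) as the fraction of kept trials landing inside Q. A minimal illustration (the function name and encoding are ours, not the paper's):

```python
import random

def monte_carlo_bel(sources, Q, trials=10000, seed=0):
    """Estimate bel(Q) under Dempster's rule by rejection sampling:
    draw one focal set from each source mass function, discard trials
    whose intersection is empty, and count trials falling inside Q."""
    rng = random.Random(seed)
    hits = kept = 0
    for _ in range(trials):
        inter = None
        for masses in sources:
            focals = list(masses)
            focal = rng.choices(focals, weights=[masses[f] for f in focals])[0]
            inter = focal if inter is None else inter & focal
        if inter:  # non-empty intersection: a consistent trial
            kept += 1
            if inter <= Q:
                hits += 1
    return hits / kept if kept else 0.0
```

When conflict is high, almost every trial is rejected, which is one intuition for why the original rule leaves the problem hard while alternative rules can behave better.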
Clustering decomposed belief functions using generalized weights of conflict
, 2008
"... We develop a method for clustering all types of belief functions, in particular nonconsonant belief functions. Such clustering is done when the belief functions concern multiple events, and all belief functions are mixed up. Clustering is performed by decomposing all belief functions into simple su ..."
Abstract

Cited by 6 (6 self)
 Add to MetaCart
We develop a method for clustering all types of belief functions, in particular non-consonant belief functions. Such clustering is done when the belief functions concern multiple events, and all belief functions are mixed up. Clustering is performed by decomposing all belief functions into simple support and inverse simple support functions that are clustered based on their pairwise generalized weights of conflict, constrained by weights of attraction assigned to keep track of all decompositions. The generalized conflict c ∈ (−∞, ∞) and generalized weight of conflict J⁻ ∈ (−∞, ∞) are derived in the combination of simple support and inverse simple support functions.
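The decomposition into simple and inverse simple support functions is Smets' canonical decomposition, whose weights come from the commonality function Q via ln w(A) = −Σ over supersets B of A of (−1)^{|B\A|} ln Q(B); w(A) < 1 yields a simple support function and w(A) > 1 an inverse one. A sketch under a bitmask encoding (assumes all commonalities are strictly positive; the generalized weights of conflict themselves are not reproduced here):

```python
import math

def commonality(m, n):
    """Q[A] = sum of m[B] over all supersets B of A (bitmask encoding)."""
    q = list(m)
    for i in range(n):
        bit = 1 << i
        for a in range(1 << n):
            if not a & bit:
                q[a] += q[a | bit]
    return q

def canonical_weights(m, n):
    """Canonical decomposition weights w(A) for every A except the frame:
    ln w(A) = -sum over supersets B of A of (-1)^{|B \\ A|} ln Q(B).
    Requires all commonalities Q(B) > 0."""
    q = commonality(m, n)
    full = (1 << n) - 1
    w = {}
    for a in range(1 << n):
        if a == full:
            continue
        rest = full ^ a          # bits that can be added to a
        lnw, sub = 0.0, rest
        while True:              # enumerate all subsets of rest
            sign = -((-1) ** bin(sub).count("1"))
            lnw += sign * math.log(q[a | sub])
            if sub == 0:
                break
            sub = (sub - 1) & rest
        w[a] = math.exp(lnw)     # w < 1: simple support; w > 1: inverse
    return w

# Simple support function focused on {e0} (mask 0b01) with weight 0.7
w = canonical_weights([0.0, 0.3, 0.0, 0.7], 2)
```

For this input the decomposition recovers exactly one non-trivial weight, w({e0}) = 0.7, with all other weights equal to 1, as a sanity check of the formula.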
Managing Decomposed Belief Functions
"... In this paper we develop a method for clustering all types of belief functions, in particular nonconsonant belief functions. Such clustering is done when the belief functions concern multiple events, and all belief functions are mixed up. Clustering is performed by decomposing all belief functions ..."
Abstract

Cited by 6 (4 self)
 Add to MetaCart
In this paper we develop a method for clustering all types of belief functions, in particular non-consonant belief functions. Such clustering is done when the belief functions concern multiple events, and all belief functions are mixed up. Clustering is performed by decomposing all belief functions into simple support and inverse simple support functions that are clustered based on their pairwise generalized weights of conflict, constrained by weights of attraction assigned to keep track of all decompositions. The generalized conflict c ∈ (−∞, ∞) and generalized weight of conflict J⁻ ∈ (−∞, ∞) are derived in the combination of simple support and inverse simple support functions.