Detecting Deception in Reputation Management, 2003
Cited by 97 (3 self)
We previously developed a social mechanism for distributed reputation management, in which an agent combines testimonies from several witnesses to determine its ratings of another agent. However, that approach does not fully protect against spurious ratings generated by malicious agents. This paper focuses on the problem of deception in testimony propagation and aggregation. We introduce some models of deception and study how to efficiently detect deceptive agents following those models. Our approach involves a novel application of the well-known weighted majority technique to belief functions and their aggregation. We describe simulation experiments that study the number of apparently accurate witnesses found in different settings, the effect of the number of witnesses on prediction accuracy, and the evolution of trust networks.
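The weighted-majority idea the abstract alludes to can be sketched as follows. This is a generic illustration of the technique, not the paper's algorithm; the names and the discount factor `beta` are invented:

```python
# Generic weighted-majority sketch (illustrative, not the paper's method):
# witnesses vote with weights, and a witness whose testimony contradicted
# the observed outcome has its weight discounted by beta.

def aggregate(testimonies, weights):
    """Weighted vote over binary testimonies (1 = trustworthy, 0 = not)."""
    total = sum(weights.values())
    score = sum(weights[w] * t for w, t in testimonies.items())
    return 1 if score >= total / 2 else 0

def update_weights(testimonies, outcome, weights, beta=0.5):
    """Discount every witness whose testimony contradicted the outcome."""
    return {w: wt * beta if testimonies[w] != outcome else wt
            for w, wt in weights.items()}

weights = {"w1": 1.0, "w2": 1.0, "w3": 1.0}
testimonies = {"w1": 1, "w2": 1, "w3": 0}      # w3 reports deceptively
prediction = aggregate(testimonies, weights)    # weighted majority says 1
weights = update_weights(testimonies, 1, weights)
```

Iterated over many interactions, a consistently deceptive witness sees its influence decay geometrically, which is the detection effect such simulations measure.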
From Computing With Numbers To Computing With Words - From Manipulation Of Measurements To Manipulation of Perceptions
Appl. Math. Comput. Sci.
Cited by 89 (3 self)
Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language, e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc. Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech and summarizing a story. Underlying this remarkable capability is the brain’s crucial ability to manipulate perceptions – perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions – a theory which may have an important bearing on how humans make – and machines might make – perception-based rational decisions in an environment of imprecision, uncertainty and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers
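The crisp-versus-fuzzy contrast can be made concrete with a toy membership function. The predicate name and breakpoints below are invented for illustration and are not taken from the article:

```python
def near(distance_km, full_until=15.0, zero_from=60.0):
    """Trapezoidal membership degree for the fuzzy perception 'near'."""
    if distance_km <= full_until:
        return 1.0
    if distance_km >= zero_from:
        return 0.0
    # linear descent between the two breakpoints
    return (zero_from - distance_km) / (zero_from - full_until)

# A measurement is crisp (a distance of exactly 20 km); the perception
# 'near' holds of that measurement only to a degree.
degree = near(20.0)   # roughly 0.89 under these invented breakpoints
```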
Perspectives on the Theory and Practice of Belief Functions
International Journal of Approximate Reasoning, 1990
Cited by 86 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors
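Hooper's rule for concurrent testimony does fall out as a special case of Dempster's rule of combination. A minimal sketch of the rule follows, in its standard formulation; the witness masses are illustrative, not from the article:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal elements, renormalize away conflict.
    A mass function maps frozensets (focal elements) to masses summing to 1."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

# Two independent witnesses, each credible with mass 0.9, both say "guilty";
# the rest of each witness's mass stays on the whole frame (ignorance).
frame = frozenset({"guilty", "innocent"})
w1 = {frozenset({"guilty"}): 0.9, frame: 0.1}
w2 = {frozenset({"guilty"}): 0.9, frame: 0.1}
m = dempster_combine(w1, w2)
# m[{'guilty'}] = 0.99, i.e. Hooper's 1 - (1 - 0.9)(1 - 0.9)
```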
Toward normative expert systems: Part I. The Pathfinder project
Methods Inf. Med., 1992
Cited by 83 (15 self)
Pathfinder is an expert system that assists surgical pathologists with the diagnosis of lymph-node diseases. The program is one of a growing number of normative expert systems that use probability and decision theory to acquire, represent, manipulate, and explain uncertain medical knowledge. In this article, we describe Pathfinder and our research in uncertain-reasoning paradigms that was stimulated by the development of the program. We discuss limitations with early decision-theoretic methods for reasoning under uncertainty and our initial attempts to use non-decision-theoretic methods. Then, we describe experimental and theoretical results that directed us to return to reasoning methods based in probability and decision theory.
Plausibility Measures and Default Reasoning
Journal of the ACM, 1996
Cited by 79 (12 self)
this paper: default reasoning. In recent years, a number of different semantics for defaults have been proposed, such as preferential structures, ε-semantics, possibilistic structures, and rankings, that have been shown to be characterized by the same set of axioms, known as the KLM properties. While this was viewed as a surprise, we show here that it is almost inevitable. In the framework of plausibility measures, we can give a necessary condition for the KLM axioms to be sound, and an additional condition necessary and sufficient to ensure that the KLM axioms are complete. This additional condition is so weak that it is almost always met whenever the axioms are sound. In particular, it is easily seen to hold for all the proposals made in the literature.
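One of the semantics listed, rankings, admits a very small sketch of default entailment: a default holds when every minimally ranked world satisfying the antecedent also satisfies the consequent. This is the standard ranking construction, not code from the paper; the worlds and ranks are invented:

```python
def entails(ranking, antecedent, consequent):
    """antecedent |~ consequent iff all lowest-ranked antecedent-worlds
    satisfy the consequent (vacuously true with no antecedent-worlds)."""
    ranks = [r for world, r in ranking.items() if antecedent(world)]
    if not ranks:
        return True
    best = min(ranks)
    return all(consequent(world)
               for world, r in ranking.items()
               if r == best and antecedent(world))

# Worlds are (bird, penguin, flies) triples; lower rank = more plausible.
ranking = {
    (True, False, True): 0,    # ordinary flying bird
    (True, True, False): 1,    # penguin: exceptional but coherent
    (True, True, True): 2,     # flying penguin: doubly exceptional
    (False, False, False): 0,
}
bird = lambda w: w[0]
penguin = lambda w: w[1]
flies = lambda w: w[2]

assert entails(ranking, bird, flies)                       # birds normally fly
assert entails(ranking, penguin, lambda w: not flies(w))   # penguins do not
```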
The Concurrent Language Shared Prolog, 1991
Cited by 74 (14 self)
Shared Prolog is a new concurrent logic language. A Shared Prolog system is composed of a set of parallel agents which are Prolog programs extended by a guard mechanism. The programmer controls the granularity of parallelism, coordinating communication and synchronization of the agents via a centralized data structure. The communication mechanism is inherited from the blackboard model of problem solving. Intuitively, the granularity of the logic processes to be elaborated in parallel is large, while the resources shared on the blackboard can be very fine-grained. An operational semantics for Shared Prolog is given in terms of a distributed model. Through an abstract notion of computation, the kinds of parallelism supported by the language as well as properties of infinite computations, such as local deadlocks, are studied. The expressiveness of the language is shown with respect to the specification of two classes of applications: meta-programming and blackboard systems.
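The blackboard coordination style described can be caricatured in a few lines. This is a sequential Python stand-in for the guard mechanism, not Shared Prolog itself; every name in it is invented:

```python
def run(blackboard, agents, max_steps=100):
    """Repeatedly fire any agent whose guard matches the blackboard;
    stop when no guard can fire (a local deadlock, in the paper's terms)."""
    for _ in range(max_steps):
        for guard, action in agents:
            token = guard(blackboard)
            if token is not None:
                action(blackboard, token)
                break
        else:
            break           # no guard fired: quiescence
    return blackboard

# One agent: its guard waits for a 'raw' token, its action cooks it.
def guard(bb):
    return next((t for t in bb if t[0] == "raw"), None)

def act(bb, token):
    bb.remove(token)
    bb.add(("cooked", token[1]))

board = run({("raw", 1), ("raw", 2)}, [(guard, act)])
# board is now {("cooked", 1), ("cooked", 2)}
```

Note the coarse/fine contrast from the abstract: each agent step is a large unit of work, while the shared tokens on the blackboard are fine-grained.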
Two views of belief: Belief as generalized probability and belief as evidence, 1992
Cited by 72 (12 self)
Belief functions are mathematical objects defined to satisfy three axioms that look somewhat similar to the Kolmogorov axioms defining probability functions. We argue that there are (at least) two useful and quite different ways of understanding belief functions. The first is as a generalized probability function (which technically corresponds to the inner measure induced by a probability function). The second is as a way of representing evidence. Evidence, in turn, can be understood as a mapping from probability functions to probability functions. It makes sense to think of updating a belief if we think of it as a generalized probability. On the other hand, it makes sense to combine two beliefs (using, say, Dempster's rule of combination) only if we think of the belief functions as representing evidence. Many previous papers have pointed out problems with the belief function approach; the claim of this paper is that these problems can be explained as a consequence of confounding the...
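The inner-measure reading is easy to make concrete: given a probability defined only on coarse blocks of a space, the induced belief in an arbitrary event is the mass of blocks wholly inside it, and the plausibility is the mass of blocks merely touching it. This is the standard construction; the numbers are invented:

```python
def inner_measure(prob_on_blocks, event):
    """Bel(A): total mass of measurable blocks contained in A."""
    return sum(p for block, p in prob_on_blocks.items() if block <= event)

def outer_measure(prob_on_blocks, event):
    """Pl(A): total mass of measurable blocks that intersect A."""
    return sum(p for block, p in prob_on_blocks.items() if block & event)

# Probability is known only on the blocks {1,2} and {3,4} of {1,2,3,4}.
blocks = {frozenset({1, 2}): 0.6, frozenset({3, 4}): 0.4}
A = frozenset({1, 2, 3})
bel = inner_measure(blocks, A)   # 0.6: only the block {1,2} lies inside A
pl = outer_measure(blocks, A)    # 1.0: both blocks meet A
```

The gap between `bel` and `pl` is exactly the mass the coarse probability cannot commit either way, which is what makes the belief function a generalized probability.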
Web Usage Mining: Discovery and Application of Interesting Patterns from Web Data, 2000
Cited by 72 (0 self)
Web Usage Mining is the application of data mining techniques to Web clickstream data in order to extract usage patterns. As Web sites continue to grow in size and complexity, the results of Web Usage Mining have become critical for a number of applications such as Web site design, business and marketing decision support, personalization, usability studies, and network traffic analysis. The two major challenges involved in Web Usage Mining are preprocessing the raw data to provide an accurate picture of how a site is being used, and filtering the results of the various data mining algorithms in order to present only the rules and patterns that are potentially interesting. This thesis develops and tests an architecture and algorithms for performing Web Usage Mining. An evidence combination framework referred to as the information filter is developed to compare and combine usage, content, and structure information about a Web site. The information filter automatically identifies the discovered ...
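A typical first preprocessing step of the kind described is sessionization of the raw clickstream. The sketch below uses the common 30-minute inactivity heuristic, a widespread convention rather than necessarily the thesis's method; the data is invented:

```python
def sessionize(clicks, timeout=1800):
    """Split one user's time-ordered (timestamp_s, url) clicks into
    sessions, starting a new session after `timeout` seconds of idle time."""
    sessions, current, last = [], [], None
    for ts, url in sorted(clicks):
        if last is not None and ts - last > timeout:
            sessions.append(current)
            current = []
        current.append(url)
        last = ts
    if current:
        sessions.append(current)
    return sessions

clicks = [(0, "/home"), (120, "/products"), (4000, "/home"), (4100, "/cart")]
sessions = sessionize(clicks)
# → [["/home", "/products"], ["/home", "/cart"]]
```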
Some syntactic approaches to the handling of inconsistent knowledge bases: A comparative study - Part 1: The flat case
Cited by 71 (12 self)
This paper presents and discusses several methods for reasoning from inconsistent knowledge bases. A so-called argued consequence relation, taking into account the existence of consistent arguments in favour of a conclusion and the absence of consistent arguments in favour of its contrary, is particularly investigated. Flat knowledge bases, i.e., without any priority between their elements, are studied under different inconsistency-tolerant consequence relations, namely the so-called argumentative, free, universal, existential, cardinality-based, and paraconsistent consequence relations. The syntax-sensitivity of these consequence relations is studied. A companion paper is devoted to the case where priorities exist between the pieces of information in the knowledge base. Key words: inconsistency, argumentation, nonmonotonic reasoning, syntax-sensitivity. * Some of the results contained in this paper were presented at the Ninth Conference on Uncertainty in Artificial Intelligence (UAI'...
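The argued consequence relation can be sketched on a deliberately tiny fragment: knowledge bases of facts and rules over signed-integer literals, where an argument is a consistent subset that derives the goal. The restriction to literals and the example base are ours, not the paper's, which treats full propositional bases:

```python
from itertools import chain, combinations

def closure(rules):
    """Forward-chain rules of the form (frozenset_of_premises, conclusion);
    literals are signed ints, so -n is the negation of n."""
    derived, changed = set(), True
    while changed:
        changed = False
        for prem, concl in rules:
            if prem <= derived and concl not in derived:
                derived.add(concl)
                changed = True
    return derived

def has_argument(kb, goal):
    """Some consistent subset of kb derives the goal."""
    subsets = chain.from_iterable(
        combinations(kb, r) for r in range(len(kb) + 1))
    for sub in subsets:
        cl = closure(sub)
        if goal in cl and not any(-l in cl for l in cl):
            return True
    return False

def argued_consequence(kb, goal):
    """Consistent argument for goal, and none for its contrary."""
    return has_argument(kb, goal) and not has_argument(kb, -goal)

# Inconsistent base: facts 1 and 2, rules 2 -> -1 and 1 -> 3.
kb = [(frozenset(), 1), (frozenset(), 2),
      (frozenset({2}), -1), (frozenset({1}), 3)]
assert not argued_consequence(kb, 1)   # -1 is also defensible
assert argued_consequence(kb, 3)       # nothing argues for -3
```

The example shows the relation's cautious character: 1 is asserted outright, yet it is not an argued consequence because a consistent sub-base derives -1.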