Results 1–10 of 101
From Computing With Numbers To Computing With Words – From Manipulation Of Measurements To Manipulation of Perceptions
 Appl. Math. Comput. Sci.
"... Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language, e.g., small, large, far, heavy, not very likely, the p ..."
Abstract

Cited by 89 (3 self)
 Add to MetaCart
Computing, in its usual sense, is centered on manipulation of numbers and symbols. In contrast, computing with words, or CW for short, is a methodology in which the objects of computation are words and propositions drawn from a natural language, e.g., small, large, far, heavy, not very likely, the price of gas is low and declining, Berkeley is near San Francisco, it is very unlikely that there will be a significant increase in the price of oil in the near future, etc. Computing with words is inspired by the remarkable human capability to perform a wide variety of physical and mental tasks without any measurements and any computations. Familiar examples of such tasks are parking a car, driving in heavy traffic, playing golf, riding a bicycle, understanding speech and summarizing a story. Underlying this remarkable capability is the brain’s crucial ability to manipulate perceptions – perceptions of distance, size, weight, color, speed, time, direction, force, number, truth, likelihood and other characteristics of physical and mental objects. Manipulation of perceptions plays a key role in human recognition, decision and execution processes. As a methodology, computing with words provides a foundation for a computational theory of perceptions – a theory which may have an important bearing on how humans make – and machines might make – perception-based rational decisions in an environment of imprecision, uncertainty and partial truth. A basic difference between perceptions and measurements is that, in general, measurements are crisp whereas perceptions are fuzzy. One of the fundamental aims of science has been and continues to be that of progressing from perceptions to measurements. Pursuit of this aim has led to brilliant successes. We have sent men to the moon; we can build computers ...
Fuzzy functional dependencies and lossless join decomposition of fuzzy relational database systems
 ACM Transactions on Database Systems
, 1988
"... This paper deals with the application of fuzzy logic in a relational database environment with the objective of capturing more meaning of the data. It is shown that with suitable interpretations for the fuzzy membership functions, a fuzzy relational data model can be used to represent ambiguities in ..."
Abstract

Cited by 73 (0 self)
 Add to MetaCart
This paper deals with the application of fuzzy logic in a relational database environment with the objective of capturing more meaning of the data. It is shown that with suitable interpretations for the fuzzy membership functions, a fuzzy relational data model can be used to represent ambiguities in data values as well as impreciseness in the association among them. Relational operators for fuzzy relations have been studied, and applicability of fuzzy logic in capturing integrity constraints has been investigated. By introducing a fuzzy resemblance measure EQUAL for comparing domain values, the definition of classical functional dependency has been generalized to fuzzy functional dependency (ffd). The implication problem of ffds has been examined and a set of sound and complete inference axioms has been proposed. Next, the problem of lossless join decomposition of fuzzy relations for a given set of fuzzy functional dependencies is investigated. It is proved that with a suitable restriction on EQUAL, the design theory of a classical relational database with functional dependencies can be extended to fuzzy relations satisfying fuzzy functional dependencies.
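The generalized dependency described in this abstract can be illustrated with a short sketch. The resemblance measure, tolerance, and example relation below are invented for illustration only and are not the paper's definitions:

```python
# Hypothetical sketch of a fuzzy resemblance measure EQUAL and a fuzzy
# functional dependency (ffd) check; the tolerance and the example
# relation are made up, not taken from the paper.

def equal(a, b, tol=10.0):
    """Fuzzy resemblance of two numeric domain values, in [0, 1]."""
    return max(0.0, 1.0 - abs(a - b) / tol)

def ffd_holds(rows, x, y):
    """Check the ffd X ->_F Y: for every pair of tuples, the Y-values
    must resemble each other at least as much as the X-values do."""
    return all(equal(r[y], s[y]) >= equal(r[x], s[x])
               for r in rows for s in rows)

rows = [{"salary": 50, "tax": 10}, {"salary": 52, "tax": 11}]
print(ffd_holds(rows, "salary", "tax"))  # True: EQUAL(10, 11) >= EQUAL(50, 52)
```

With a crisp EQUAL (1 iff the values are identical) the check above degenerates to the classical functional dependency, which is the sense in which the paper's generalization extends the classical design theory.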
Web Usage Mining: Discovery and Application of Interesting Patterns from Web Data
, 2000
"... Web Usage Mining is the application of data mining techniques to Web clickstream data in order to extract usage patterns. As Web sites continue to grow in size and complexity, the results of Web Usage Mining have become critical for a number of applications such as Web site design, business and mark ..."
Abstract

Cited by 72 (0 self)
 Add to MetaCart
Web Usage Mining is the application of data mining techniques to Web clickstream data in order to extract usage patterns. As Web sites continue to grow in size and complexity, the results of Web Usage Mining have become critical for a number of applications such as Web site design, business and marketing decision support, personalization, usability studies, and network traffic analysis. The two major challenges involved in Web Usage Mining are preprocessing the raw data to provide an accurate picture of how a site is being used, and filtering the results of the various data mining algorithms in order to present only the rules and patterns that are potentially interesting. This thesis develops and tests an architecture and algorithms for performing Web Usage Mining. An evidence combination framework referred to as the information filter is developed to compare and combine usage, content, and structure information about a Web site. The information filter automatically identifies the discovered ...
Discovery of Interesting Usage Patterns from Web Data
 Advances in Web Usage Analysis and User Profiling. LNAI 1836
, 1999
"... . Web Usage Mining is the application of data mining techniques to large Web data repositories in order to extract usage patterns. As with many data mining application domains, the identification of patterns that are considered interesting is a problem that must be solved in addition to simply g ..."
Abstract

Cited by 51 (1 self)
 Add to MetaCart
Web Usage Mining is the application of data mining techniques to large Web data repositories in order to extract usage patterns. As with many data mining application domains, the identification of patterns that are considered interesting is a problem that must be solved in addition to simply generating them. A necessary step in identifying interesting results is quantifying what is considered uninteresting in order to form a basis for comparison. Several research efforts have relied on manually generated sets of uninteresting rules. However, manual generation of a comprehensive set of evidence about beliefs for a particular domain is impractical in many cases. Generally, domain knowledge can be used to automatically create evidence for or against a set of beliefs. This paper develops a quantitative model based on support logic for determining the interestingness of discovered patterns. For Web Usage Mining, there are three types of domain information available: usage, co...
Soft Computing: the Convergence of Emerging Reasoning Technologies
 Soft Computing
, 1997
"... The term Soft Computing (SC) represents the combination of emerging problemsolving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provide us with complementary reasoning and searching methods to so ..."
Abstract

Cited by 50 (8 self)
 Add to MetaCart
The term Soft Computing (SC) represents the combination of emerging problem-solving technologies such as Fuzzy Logic (FL), Probabilistic Reasoning (PR), Neural Networks (NNs), and Genetic Algorithms (GAs). Each of these technologies provides us with complementary reasoning and searching methods to solve complex, real-world problems. After a brief description of each of these technologies, we will analyze some of their most useful combinations, such as the use of FL to control GA and NN parameters; the application of GAs to evolve NNs (topologies or weights) or to tune FL controllers; and the implementation of FL controllers as NNs tuned by backpropagation-type algorithms.
PULCINELLA: A General Tool for Propagating Uncertainty in Valuation Networks
 Proc. 7th Conf. on Uncertainty in AI, 323–331
, 1991
"... We present PULCinella and its use in comparing uncertainty theories. PULCinella is a general tool for Propagating Uncertainty based on the Local Computation technique of Shafer and Shenoy. It may be specialized to different uncertainty theories: at the moment, Pulcinella can propagate probabilities, ..."
Abstract

Cited by 47 (1 self)
 Add to MetaCart
We present PULCinella and its use in comparing uncertainty theories. PULCinella is a general tool for Propagating Uncertainty based on the Local Computation technique of Shafer and Shenoy. It may be specialized to different uncertainty theories: at the moment, Pulcinella can propagate probabilities, belief functions, Boolean values, and possibilities. Moreover, Pulcinella allows the user to easily define his own specializations. To illustrate Pulcinella, we analyze two examples by using each of the four theories above. In the first one, we mainly focus on intrinsic differences between theories. In the second one, we take a knowledge engineer viewpoint, and check the adequacy of each theory to a given problem.
Toward a generalized theory of uncertainty (GTU): An outline
 Information Sciences
, 2005
"... It is a deepseated tradition in science to view uncertainty as a province of probability theory. The generalized theory of uncertainty (GTU) which is outlined in this paper breaks with this tradition and views uncertainty in a much broader perspective. Uncertainty is an attribute of information. A ..."
Abstract

Cited by 39 (1 self)
 Add to MetaCart
It is a deep-seated tradition in science to view uncertainty as a province of probability theory. The generalized theory of uncertainty (GTU) which is outlined in this paper breaks with this tradition and views uncertainty in a much broader perspective. Uncertainty is an attribute of information. A fundamental premise of GTU is that information, whatever its form, may be represented as what is called a generalized constraint. The concept of a generalized constraint is the centerpiece of GTU. In GTU, a probabilistic constraint is viewed as a special, albeit important, instance of a generalized constraint. A generalized constraint is a constraint of the form X isr R, where X is the constrained variable, R is a constraining relation, generally non-bivalent, and r is an indexing variable which identifies the modality of the constraint, that is, its semantics. The ...
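As a rough illustration of two modalities of the generalized constraint X isr R, the sketch below contrasts a possibilistic constraint (R a fuzzy set) with a probabilistic one (R a distribution). The fuzzy set "small", the distribution, and all names are invented for illustration, not taken from the paper:

```python
# Illustrative sketch of two modalities of "X isr R"; the fuzzy set
# "small" and the uniform distribution below are made up.

def small(x):
    """Possibilistic constraint "X is small": the membership grade of a
    value x is the possibility that X takes that value."""
    return max(0.0, 1.0 - x / 10.0)

# Probabilistic constraint "X isp R": R is a probability distribution.
uniform = {x: 0.2 for x in range(5)}

print(small(2))    # possibility 0.8 that X = 2 under "X is small"
print(uniform[2])  # probability 0.2 that X = 2 under "X isp uniform"
```

The point of the contrast: possibility grades need not sum to one (they bound what is plausible), whereas the probabilistic modality must normalize to one.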
Query evaluation in probabilistic relational databases
 Theoretical Computer Science
, 1997
"... This paper describes a generalization of the relational model in order to capture and manipulate a type of probabilistic information. Probabilistic databases are formalized by means of logic theories based on a probabilistic firstorder language proposed by Halpern. A sound a complete method is desc ..."
Abstract

Cited by 32 (0 self)
 Add to MetaCart
This paper describes a generalization of the relational model in order to capture and manipulate a type of probabilistic information. Probabilistic databases are formalized by means of logic theories based on a probabilistic first-order language proposed by Halpern. A sound and complete method is described for evaluating queries in probabilistic theories. The generalization proposed can be incorporated into existing relational systems with the addition of a component for manipulating propositional formulas.
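The closing remark about a component for manipulating propositional formulas can be sketched in a toy form: answer tuples carry boolean formulas over independent events, and a tuple's probability is the probability of its formula. The encoding, event names, and probabilities below are invented for illustration and are not the paper's method:

```python
# Toy sketch (not the paper's algorithm): answer tuples carry
# propositional formulas over independent events; the probability of an
# answer is the probability that its formula is true.
from itertools import product

events = {"e1": 0.9, "e2": 0.6}  # hypothetical independent events

def prob(formula):
    """Probability of a boolean formula over the independent events,
    computed by enumerating all truth assignments."""
    total = 0.0
    names = list(events)
    for bits in product([False, True], repeat=len(names)):
        env = dict(zip(names, bits))
        p = 1.0
        for n, v in env.items():
            p *= events[n] if v else 1.0 - events[n]
        if formula(env):
            total += p
    return total

# A tuple present iff both contributing events hold: P = 0.9 * 0.6.
print(round(prob(lambda env: env["e1"] and env["e2"]), 2))  # 0.54
```

Enumeration is exponential in the number of events; it serves only to show why a propositional-formula component suffices to extend a relational system with probabilities.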
What Are Fuzzy Rules and How to Use Them
 Fuzzy Sets and Systems
, 1996
"... Fuzzy rules have been advocated as a key tool for expressing pieces of knowledge in "fuzzy logic". However, there does not exist a unique kind of fuzzy rules, nor is there only one type of "fuzzy logic". This diversity has caused many a misunderstanding in the literature of fuzzy control. The paper ..."
Abstract

Cited by 31 (12 self)
 Add to MetaCart
Fuzzy rules have been advocated as a key tool for expressing pieces of knowledge in "fuzzy logic". However, there does not exist a unique kind of fuzzy rules, nor is there only one type of "fuzzy logic". This diversity has caused many a misunderstanding in the literature of fuzzy control. The paper is a survey of different possible semantics for a fuzzy rule and shows how they can be captured in the framework of fuzzy set and possibility theory. It is pointed out that the interpretation of fuzzy rules dictates the way the fuzzy rules should be combined. The various kinds of fuzzy rules considered in the paper (gradual rules, certainty rules, possibility rules, and others) have different inference behaviors and correspond to various intended uses and applications. The representation of fuzzy unless-rules is briefly investigated on the basis of their intended meaning. The problem of defining and checking the coherence of a block of parallel fuzzy rules is also briefly addressed. This iss...
The Interpretation of Fuzziness
 IEEE Transactions on Systems, Man, and Cybernetics
, 1996
"... From laserscanned data to feature human model: a system based on ..."
Abstract

Cited by 25 (13 self)
 Add to MetaCart
From laser-scanned data to feature human model: a system based on ...