Results 11 – 20 of 55
Automated Reasoning with Uncertainties, 1992
Abstract

Cited by 7 (7 self)
In this work we assume that uncertainty is a multifaceted concept which admits several different measures, and present a system for automated reasoning with multiple representations of uncertainty. Our focus is on problems which present more than one of these facets, and in which a multivalued representation of uncertainty and the feasibility of its computational realisation are therefore important for designing and implementing knowledge-based systems. We present a case study on developing a computational language for reasoning with uncertainty, starting with a semantically sound and computationally tractable language and gradually extending it with specialised syntactic constructs to represent measures of uncertainty, preserving its unambiguous semantic characterisation and computability properties. Our initial language is the language of normal clauses with SLDNF as the inference rule, and we select three facets of uncertainty which are not exhaustive but cover many situations found in practical problems: vagueness, statistics and degrees of belief. To each of these facets we associate a specific measure: fuzzy measures to vagueness, probabilities on the domain to statistics, and probabilities on possible worlds to degrees of belief. The resulting language is semantically sound and computationally tractable, and admits relatively efficient implementations employing α–β pruning and caching.
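The evaluation style such a language suggests can be sketched as a toy min/max interpreter with caching; the predicates, degrees and rule below are invented for illustration and this is not the paper's actual system:

```python
# Annotated facts: each ground atom carries a degree in [0, 1].
# (All predicates and degrees here are made up for illustration.)
facts = {("tall", "mia"): 0.7, ("fast", "mia"): 0.9}
# A rule holds to the minimum degree of its body atoms.
rules = [(("athletic", "mia"), [("tall", "mia"), ("fast", "mia")])]

def degree(goal, cache={}):
    """Best degree of a goal: max over alternative proofs, min over a body."""
    if goal in cache:                      # caching, as the abstract mentions
        return cache[goal]
    best = facts.get(goal, 0.0)
    for head, body in rules:
        if head == goal:
            best = max(best, min(degree(b) for b in body))
    cache[goal] = best
    return best

print(degree(("athletic", "mia")))  # 0.7 = min(0.7, 0.9)
```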
A Łukasiewicz Logic Based Prolog, Mathware and Soft Computing, 1994
Abstract

Cited by 5 (1 self)
Prolog is a programming language based on a restricted subset of classical first-order predicate logic. In order to overcome some limitations of classical logic in handling imperfect human knowledge, we provide a formal framework for a Łukasiewicz logic based Prolog system. The use of Łukasiewicz logic, with its connection to Ulam games, enables us to deal with partial inconsistencies by interpreting the truth values as relative distance to contradiction. We also present the software tool LULOG, which is based on the theoretical results of this paper and can be seen as a Prolog system for many-valued logic. Applications of LULOG to an Ulam game and an example of reasoning with imperfect knowledge are also discussed.

1 Introduction

Classical logic provides a framework for the formulation and implementation of knowledge-based systems. The programming language Prolog [6, 7] is based on a subset of first-order predicate logic and thus a considerable number of artificial intelligence applicat...
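The Łukasiewicz connectives the abstract relies on have standard definitions, which a short sketch makes concrete (the example truth values are invented):

```python
# Standard Łukasiewicz connectives on [0, 1].
def luk_and(a, b):        # strong conjunction (t-norm)
    return max(0.0, a + b - 1.0)

def luk_or(a, b):         # strong disjunction (t-conorm)
    return min(1.0, a + b)

def luk_implies(a, b):    # residuated implication
    return min(1.0, 1.0 - a + b)

# Partial inconsistency: p and not-p can both hold to degree 0.5,
# yet their conjunction is 0 rather than an outright contradiction.
print(luk_and(0.5, 0.5))      # 0.0
print(luk_implies(0.8, 0.6))  # 0.8
```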
Prolog Extensions to Many-Valued Logics, 1992
Abstract

Cited by 3 (2 self)
The aim of this paper is to show that restricting a logical language to clauses like the Horn clauses used in Prolog, applied to [0,1]-valued logics, leads to calculi with a sound and complete proof theory. In contrast to other models, where generally the set of axioms as well as the deduction schemata are enriched, we restrict ourselves to a simple modification of the deduction rules of classical logic without adding new axioms. In our model the truth values from the unit interval can be interpreted in a probabilistic sense, so that a value between 0 and 1 is not just intuitively interpreted as a 'degree of truth'.

Keywords: Prolog; [0,1]-valued logic; probabilistic logic; possibilistic logic

1 Introduction

N. Rescher [23] pointed out that there are at least three different approaches to the field of many-valued logic, namely:
• the metalogical viewpoint, which is mainly concerned with proof-theoretic and algebraic aspects of logical systems, as for example described in ...
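One way a [0,1]-valued Horn-clause calculus can be operationalized is bottom-up evaluation to a fixpoint; the Łukasiewicz-style combination and the clauses below are our own illustration, not the paper's exact deduction rules:

```python
# Sketch: forward chaining over [0,1]-valued Horn clauses. A rule with
# value r and body degree b supports its head to degree max(0, r + b - 1)
# (Łukasiewicz); b is the min over the body. Clause names are invented.
clauses = [
    ("wet",      ["rain"], 0.9),   # rain -> wet, clause value 0.9
    ("slippery", ["wet"],  0.8),
    ("rain",     [],       0.7),   # graded fact
]

def fixpoint(clauses):
    val = {}
    changed = True
    while changed:                 # iterate until no degree improves
        changed = False
        for head, body, r in clauses:
            b = min((val.get(a, 0.0) for a in body), default=1.0)
            new = max(0.0, r + b - 1.0)
            if new > val.get(head, 0.0):
                val[head] = new
                changed = True
    return val

print(fixpoint(clauses))  # rain: 0.7, wet: 0.6, slippery: 0.4
```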
Multivalued logics and fuzzy reasoning, BCS AISB Summer School, 1975
Abstract

Cited by 3 (2 self)
These notes are concerned with recent developments in multivalued logic, particularly in fuzzy logic and its status as a model for human linguistic reasoning. This first section discusses the status of formal logic and the need for logics of approximate reasoning with vague data. The following sections present a basic account of fuzzy set theory; fuzzy logics; Zadeh's model of linguistic hedges and fuzzy reasoning; and finally a bibliography of all Zadeh's papers and other selected references. Models of the human reasoning process are clearly very relevant to artificial intelligence (AI) studies. Broadly there are two types: psychological models of what people actually do, and formal models of what logicians and philosophers feel a rational individual would, or should, do. The main problem with the former is that it is extremely difficult to monitor thought processes: the behaviorist approach is perhaps reasonable with rats but a ridiculously inadequate source of data on man; the introspectionist approach is far more successful (e.g. in analysing human chess ...
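Zadeh's linguistic hedges, mentioned above, have standard definitions as operations on membership degrees; the membership values below are made up for illustration:

```python
# Zadeh's classic hedges: "very" concentrates a membership function,
# "somewhat" dilates it. The "tall" degrees here are invented.
tall = {"ann": 0.8, "bob": 0.4}

def very(mu):
    return mu ** 2          # concentration

def somewhat(mu):
    return mu ** 0.5        # dilation

print({p: round(very(m), 2) for p, m in tall.items()})
# {'ann': 0.64, 'bob': 0.16}
```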
Using FLOPER for Running/Debugging Fuzzy Logic Programs, Proceedings of IPMU'08, 2008
Abstract

Cited by 2 (0 self)
Fuzzy Logic Programming is an interesting and still growing research area that brings together the efforts to introduce fuzzy logic into logic programming, in order to equip such languages with more expressive resources for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach represents a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical tools implemented so far. In this work we describe a prototype system, the FLOPER tool, which is able to directly translate fuzzy logic programs into Prolog code in order to safely execute these final programs inside any standard Prolog interpreter in a way that is completely transparent to the final user. The system also generates a low-level representation of the fuzzy code, offering debugging (tracing) capabilities and opening the door to the design of more sophisticated program manipulation tasks such as program optimization, program specialization and so on.
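The multi-adjoint idea, that each rule names its own conjunction operator, can be sketched roughly as follows; this is our illustration, not FLOPER's actual translation into Prolog, and the predicates, weights and t-norm choices are invented:

```python
import math

# Two common t-norms a multi-adjoint rule might pick from.
def prod_tnorm(xs):   # product t-norm
    return math.prod(xs)

def godel_tnorm(xs):  # Gödel t-norm
    return min(xs)

facts = {"near": 0.9, "cheap": 0.6}
# Each rule carries its own weight and its own conjunction operator;
# the weight is combined multiplicatively here (a simplification).
rules = [("good_hotel", ["near", "cheap"], 0.8, prod_tnorm)]

def solve(goal):
    best = facts.get(goal, 0.0)
    for head, body, weight, tnorm in rules:
        if head == goal:
            best = max(best, weight * tnorm([solve(a) for a in body]))
    return best

print(round(solve("good_hotel"), 3))  # 0.432 = 0.8 * 0.9 * 0.6
```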
Combining fuzzy logic and behavioral similarity for non-strict program validation, In Proc. of the 8th Symp. on Principles and Practice of Declarative Programming, 2006
Abstract

Cited by 2 (1 self)
The quality of an application’s implementation can be assured by validating the presence or absence of a set of user-prescribed software patterns such as software engineering best practices, programming conventions and indications of poor programming. Existing pattern detection techniques, however, interpret pattern descriptions in an inflexible manner, leaving the quality assurance tool to approve only the most strictly adhering pattern implementations. In order to detect various concrete pattern implementations using a single pattern description, we have combined logic metaprogramming (wherein patterns can be expressed as constraints over facts representing a program’s source code), fuzzy logic and static program analysis in a way that is completely transparent to the end user. We have achieved this by having the conditions in a logic rule interpreted as constraints over the run-time behavior that source code constructs give rise to, instead of as constraints over the literal source code. A single pattern description then often suffices to recognize various concrete implementation variants, with an indication of their similarity to the pattern description.
Representing Numeric Values in Concept Lattices
Abstract

Cited by 2 (0 self)
Abstract. Formal Concept Analysis is based on the occurrence of symbolic attributes in individual objects, or observations. But when the attribute is numeric, treatment has been awkward. In this paper, we show how one can derive logical implications in which the atoms can be not only boolean symbolic attributes, but also ordinal inequalities, such as x ≤ 9. This extension to ordinal values is new. It employs the fact that orderings are antimatroid closure spaces.

1 Extending Formal Concept Analysis

Formal Concept Analysis (FCA), which was initially developed by Rudolf Wille and Bernhard Ganter [3], provides a superb way of describing “concepts”, that is, closed sets of attributes, or properties, within a context of occurrences, or objects. One can regard the concept as a closed set of objects with common attributes. Frequently, clusters of these concepts, together with their structure, stand out with vivid clarity. However, two unresolved problems are often encountered. First, when concept lattices become large, it is hard to discern or describe significant clusters of related concepts. Gregor Snelting used formal concept analysis to analyze legacy code [6, 14]. Snelting’s goal was to reconstruct the overall system structure by determining which variables (attributes or columns) ...
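The derivation (Galois) operators at the heart of FCA can be shown on a toy boolean context; this illustrates plain FCA only, not the paper's ordinal extension, and the objects and attributes are invented:

```python
# A toy formal context: objects mapped to their attribute sets.
context = {
    "o1": {"red", "round"},
    "o2": {"red", "square"},
    "o3": {"red", "round"},
}

def extent(attrs):
    """All objects possessing every attribute in attrs."""
    return {o for o, a in context.items() if attrs <= a}

def intent(objs):
    """All attributes common to every object in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set()

# The formal concept generated by the attribute "round":
objs = extent({"round"})
print(sorted(objs), sorted(intent(objs)))  # ['o1', 'o3'] ['red', 'round']
```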
Graph Summarization in Annotated Data Using Probabilistic Soft Logic
Abstract

Cited by 2 (1 self)
Abstract. Annotation graphs, made available through the Linked Data initiative and Semantic Web, have significant scientific value. However, their increasing complexity makes it difficult to fully exploit this value. Graph summaries, which group similar entities and relations for a more abstract view on the data, can help alleviate this problem, but new methods for graph summarization are needed that handle the uncertainty present within and across these sources. Here, we propose the use of probabilistic soft logic (PSL) [1] as a general framework for reasoning about annotation graphs, similarities, and the possibly confounding evidence arising from these. We show preliminary results using two simple graph summarization heuristics in PSL for a plant biology domain.
Fuzzy Multidimensional Analysis and Resolution Operation, Computer Science Journal of Moldova, 1998
Abstract

Cited by 2 (2 self)
In this paper a new approach to the analysis of fuzzy multidimensional distributions is described. A uniform method for representing fuzzy multidimensional distributions by means of sectioned vectors and matrices is proposed. A sectioned matrix is interpreted as a fuzzy conjunctive normal form, while its line vectors are interpreted as fuzzy disjunctions. Several useful characteristics of fuzzy distributions and disjunctions are defined and studied. The main operation for manipulating fuzzy multidimensional distributions is an original fuzzy resolution, which is applied to any two disjunctions on some variable and results in a third disjunction called the resolvent. The property of adjacency of two disjunctions is defined and the criterion of adjacency is formulated. It is shown that the proposed resolution operation is a generalization of conventional resolution, and the whole approach can be viewed as a generalization of propositional logic. Methods for finding prime disjunctions, projecting on a variable (thus solving the satisfiability problem) and transforming into the dual form are proposed.
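One simple way to generalize a propositional resolution step to graded disjunctions can be sketched as follows; this is our own illustration with invented clauses, not necessarily the resolution operation defined in the paper:

```python
# Graded clauses map literals to membership degrees. Resolving on x
# drops x and its negation "~x" and keeps the max degree of every
# remaining literal (so the resolvent is again a graded disjunction).
def resolve(c1, c2, var):
    assert var in c1 and ("~" + var) in c2
    resolvent = {}
    for clause in (c1, c2):
        for lit, deg in clause.items():
            if lit not in (var, "~" + var):
                resolvent[lit] = max(deg, resolvent.get(lit, 0.0))
    return resolvent

c1 = {"p": 0.7, "q": 0.4}
c2 = {"~p": 0.9, "r": 0.6}
print(resolve(c1, c2, "p"))  # {'q': 0.4, 'r': 0.6}
```

With crisp degrees (0 or 1) this reduces to the conventional propositional resolution step.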
An Overview of the Theory of Relaxed Unification, 2003
Abstract

Cited by 1 (1 self)
We give an overview of the Theory of Relaxed Unification, a novel theory that extends the classical Theory of Unification. Classical unification requires a perfect agreement between the terms being unified. In practice, data is seldom error-free and can contain inconsistent information; classical unification fails when the data is imperfect. We propose the Theory of Relaxed Unification as a new theory that relaxes the constraints of classical unification without requiring special preprocessing of data. Relaxed unification tolerates possible errors and inconsistencies in the data and facilitates reasoning under uncertainty. The Theory of Relaxed Unification is more general and has higher efficacy than the classical Theory of Unification. We present the fundamental concepts of relaxed unification, a relaxed unification algorithm, and a number of examples.
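The core idea, tolerating clashes instead of failing, can be sketched as an error-counting variant of term matching; this simplified version ignores variable bindings entirely and is not the paper's full algorithm:

```python
# Terms: a tuple ("f", arg1, arg2, ...) is a compound term, a plain
# string is a constant, and strings starting with "?" are variables.
def relaxed_unify(t1, t2):
    """Return the number of clashing positions (0 = classical success).

    Simplification: variables match anything and bindings are not
    tracked, so consistency of repeated variables is not checked.
    """
    if isinstance(t1, str) and t1.startswith("?"):
        return 0
    if isinstance(t2, str) and t2.startswith("?"):
        return 0
    if isinstance(t1, tuple) and isinstance(t2, tuple) \
            and t1[0] == t2[0] and len(t1) == len(t2):
        return sum(relaxed_unify(a, b) for a, b in zip(t1[1:], t2[1:]))
    return 0 if t1 == t2 else 1

# f(a, b) vs f(a, c): one clash, tolerated rather than fatal.
print(relaxed_unify(("f", "a", "b"), ("f", "a", "c")))  # 1
```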