Results 1–10 of 12
Self-Organizing Maps in Natural Language Processing
1997
Abstract
Cited by 38 (2 self)
Kohonen's Self-Organizing Map (SOM) is one of the most popular artificial neural network algorithms. Word category maps are SOMs that have been organized according to word similarities, measured by the similarity of the short contexts of the words. Conceptually interrelated words tend to fall into the same or neighboring map nodes. Nodes may thus be viewed as word categories. Although no a priori information about classes is given, during the self-organizing process a model of the word classes emerges. The central topic of the thesis is the use of the SOM in natural language processing. The approach based on the word category maps is compared with the methods that are widely used in artificial intelligence research. Modeling gradience, conceptual change, and subjectivity of natural language interpretation are considered. The main application area is information retrieval and textual data mining, for which a specific SOM-based method called the WEBSOM has been developed. The WEBSOM metho...
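The word category maps described in this abstract can be illustrated with a toy self-organizing map. The sketch below is a hypothetical minimal 1-D SOM over invented "short context" vectors, not the WEBSOM method itself; the vocabulary, data, and map size are all made up for illustration.

```python
# Minimal sketch of a 1-D self-organizing map over toy word-context
# vectors (hypothetical data; not the WEBSOM implementation).
import numpy as np

rng = np.random.default_rng(0)

# Toy "short context" vectors: rows are words, columns context features.
words = ["cat", "dog", "run", "walk"]
contexts = np.array([
    [1.0, 0.9, 0.0, 0.1],   # cat
    [0.9, 1.0, 0.1, 0.0],   # dog
    [0.0, 0.1, 1.0, 0.9],   # run
    [0.1, 0.0, 0.9, 1.0],   # walk
])

n_nodes = 3
weights = rng.random((n_nodes, contexts.shape[1]))

for epoch in range(200):
    lr = 0.5 * (1 - epoch / 200)                  # decaying learning rate
    radius = max(1.0 * (1 - epoch / 200), 0.01)   # shrinking neighborhood
    for x in contexts:
        # Best-matching unit: node whose weight vector is closest to x.
        bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
        for j in range(n_nodes):
            # Gaussian neighborhood: nearby nodes are pulled along.
            h = np.exp(-((j - bmu) ** 2) / (2 * radius ** 2))
            weights[j] += lr * h * (x - weights[j])

# Words with similar contexts should land on the same or nearby nodes.
for w, x in zip(words, contexts):
    print(w, int(np.argmin(np.linalg.norm(weights - x, axis=1))))
```

After training, conceptually related words ("cat"/"dog", "run"/"walk") map to the same or adjacent nodes, which is the sense in which nodes act as emergent word categories.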
Tenacious Tortoises: A formalism for argument over rules of inference
Computational Dialectics (ECAI 2000 Workshop), 2000
Abstract
Cited by 10 (7 self)
As multi-agent systems proliferate and employ different and more sophisticated formal logics, it is increasingly likely that agents will be reasoning with different rules of inference. Hence, an agent seeking to convince another of some proposition may first have to convince the latter to use a rule of inference which it has not thus far adopted. We define a formalism to represent degrees of acceptability or validity of rules of inference, to enable autonomous agents to undertake dialogue concerning inference rules. Even when they disagree over the acceptability of a rule, two agents may still use the proposed formalism to reason collaboratively.
Reason and Rationality
Abstract
Cited by 5 (2 self)
Over the past few decades, reasoning and rationality have been the focus of enormous interdisciplinary attention, attracting interest from philosophers, psychologists, economists, statisticians and anthropologists, among others. The widespread interest in the topic reflects the central status of reasoning in human affairs. But it also suggests that...
The philosophical significance of Cox’s theorem
International Journal of Approximate Reasoning, 2004
Abstract
Cited by 4 (2 self)
Cox’s theorem states that, under certain assumptions, any measure of belief is isomorphic to a probability measure. This theorem, although intended as a justification of the subjectivist interpretation of probability theory, is sometimes presented as an argument for more controversial theses. Of particular interest is the thesis that the only coherent means of representing uncertainty is via the probability calculus. In this paper I examine the logical assumptions of Cox’s theorem and I show how these impinge on the philosophical conclusions thought to be supported by the theorem. I show that the more controversial thesis is not supported by Cox’s theorem.
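For reference, the functional-equation assumptions usually attributed to Cox can be sketched as follows (the notation b, F, S is ours, not the paper's; the exact regularity assumptions are precisely what the paper scrutinizes):

```latex
% Hedged sketch of Cox-style assumptions: a real-valued belief measure b
% composes via unknown functions F (conjunction) and S (negation).
\begin{align*}
  b(A \wedge B \mid C) &= F\bigl(b(A \mid C),\ b(B \mid A \wedge C)\bigr) \\
  b(\neg A \mid C)     &= S\bigl(b(A \mid C)\bigr)
\end{align*}
% Under suitable regularity assumptions on F and S, the theorem yields a
% monotone rescaling of b into a probability measure, with F becoming
% multiplication and S(x) = 1 - x.
```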
Fuzzy Concepts and Formal Methods: Some Illustrative Examples
In Proceedings of the Seventh Asia-Pacific Software Engineering Conference, 1999
Abstract
Cited by 3 (3 self)
It has been recognised that formal methods are useful as a modelling tool in requirements engineering. Specification languages such as Z permit the precise and unambiguous modelling of system properties and behaviour. However, some system problems, particularly those drawn from the IS problem domain, may be difficult to model in crisp or precise terms.
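A minimal sketch of the contrast this abstract draws, using an invented "fast response" requirement (the cut-offs and function names are hypothetical, not taken from the paper):

```python
# Hypothetical illustration: a crisp requirement versus a graded (fuzzy)
# one, as a sketch of the vagueness the paper discusses.

def crisp_fast(response_ms: float) -> bool:
    # Crisp predicate in the style of a Z invariant: an arbitrary
    # 200 ms cut-off, true or false with nothing in between.
    return response_ms <= 200

def fuzzy_fast(response_ms: float) -> float:
    # Degree of membership in "fast": 1 below 100 ms, 0 above 500 ms,
    # decreasing linearly in between.
    if response_ms <= 100:
        return 1.0
    if response_ms >= 500:
        return 0.0
    return (500 - response_ms) / 400

print(crisp_fast(201))            # False: one millisecond flips the verdict
print(round(fuzzy_fast(201), 2))  # roughly 0.75: a graded judgement
```

The crisp version forces an arbitrary boundary on an inherently vague notion; the fuzzy version records how borderline a given response time is.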
Fuzzy Epistemicism
2007
Abstract
Cited by 1 (0 self)
Please do not cite or quote without permission. It is taken for granted in much of the literature on vagueness that semantic and epistemic approaches to vagueness are fundamentally at odds. If we can analyze borderline cases and the sorites paradox in terms of degrees of truth, then we don’t need an epistemic explanation. Conversely, if an epistemic explanation suffices, then there is no reason to depart from the familiar simplicity of classical bivalent semantics. I question this assumption, showing that there is an intelligible motivation for adopting a many-valued semantics even if one accepts a form of epistemicism. The resulting hybrid view has advantages over both classical epistemicism and traditional many-valued approaches. While an epistemic approach to vagueness is not logically incompatible with the view that truth comes in degrees, it is usually assumed that there could be no motivation for combining the two.
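One concrete way to make "degrees of truth" precise is the Łukasiewicz many-valued semantics; the sketch below is our illustrative choice of connectives, not necessarily the semantics the paper adopts.

```python
# Łukasiewicz-style many-valued connectives on degrees in [0, 1]:
# one standard way to make "degrees of truth" precise (our choice of
# semantics for illustration, not necessarily the paper's).

def t_not(a: float) -> float:
    return 1.0 - a

def t_and(a: float, b: float) -> float:
    return max(0.0, a + b - 1.0)

def t_or(a: float, b: float) -> float:
    return min(1.0, a + b)

def t_implies(a: float, b: float) -> float:
    return min(1.0, 1.0 - a + b)

# A borderline case: "x is tall" has degree 0.5, and so does its
# negation, reflecting genuine indeterminacy rather than ignorance.
tall = 0.5
print(t_not(tall))                    # 0.5
print(t_and(tall, t_not(tall)))       # 0.0
print(round(t_implies(0.9, 0.8), 3))  # 0.9
```

On this semantics a conditional between nearly-equal degrees stays highly true, which is how many-valued treatments defuse the sorites: each step of the paradoxical argument is almost, but not quite, fully true.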
The Universal Generalization Problem
Logique & Analyse, 2009
Abstract
Cited by 1 (1 self)
Gentzen gave different justifications of universal generalization. In particular, Gentzen’s justification is the one currently used in most logic textbooks. In this paper I argue that all such justifications are problematic, and propose an alternative justification which is related to the approach to generality of Greek mathematics.
Around fuzziness. Some philosophical thoughts
IFSA-EUSFLAT 2009
Abstract
Fuzzy logic has had considerable success in the area of applications. However, this active growth in implementation in applied electronic devices has not always been accompanied by a discussion of the theoretical grounds underlying those uses. This paper deals with some theoretical aspects that seem worth discussing in order to clarify the status of fuzzy logic, to get at its true nature, and to point out its achievements and limits. To carry out this goal, the paper is organized around three questions: 1) Fuzzy logic: is it a logic or a technologic?; 2) How fuzzy is fuzzy computing?; 3) Are there fuzzy objects?
Fuzzy-Based Pricing Models for New Products
ijaras.isair.org
"... ©International society of academic and industrial research www.isair.org ..."