Results 1–10 of 24
New directions in the analysis and interactive elicitation of personal construct systems
1981
Cited by 38 (25 self)
Abstract:
The computer elicitation and analysis of personal construct systems has become a technique of great interest and wide application in recent years. This paper takes the current state of the art as a starting point and explores further developments that are natural extensions of it. The overall objective of the work described is to develop man-computer symbiotic systems in which the computer is a truly dialectical partner to the person in forming theories and making decisions. A logical model of constructs as predicates applying to elements is used to develop a logical analysis of construct structures, and this is contrasted with various distance-based clustering techniques. A grid analysis program called ENTAIL is described based on these techniques, which derives a network of entailments from a grid. This is compared and contrasted with various programs for repertory grid analysis such as INGRID, FOCUS and Q-Analysis. Entailment is discussed in relation to Kelly's superordination hierarchy over constructs and preference relations over elements. The entailment analysis is extended to rating-scale data using a fuzzy semantic model. The significance of Kelly's notion of the opposite to a construct, as opposed to its negation, is discussed and related to other epistemological models and the role of relevance. Finally, the interactive construct elicitation program PEGASUS is considered in terms of the psychological and philosophical importance of the dialectical processes of grid elicitation and analysis, and recommendations are made about its generalization and extension based on the logical foundations described. Links are established between the work on repertory grids and that on relational databases and expert systems.
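The entailment derivation that an ENTAIL-style analysis performs can be illustrated in miniature: treating each construct as a predicate over elements, construct A entails construct B when every element satisfying A also satisfies B. The sketch below (constructs, elements and names all invented for the example, not taken from the paper) implements this subset test over a toy binary grid:

```python
# Hypothetical sketch of deriving entailments from a binary repertory grid:
# construct A entails construct B if the set of elements to which A applies
# is contained in the set to which B applies.

grid = {
    # construct -> set of elements the construct applies to (invented data)
    "friendly":  {"Ann", "Bob", "Carol"},
    "talkative": {"Ann", "Bob"},
    "quiet":     {"Dave"},
}

def entailments(grid):
    """Return all pairs (a, b) such that construct a entails construct b."""
    pairs = []
    for a, elems_a in grid.items():
        for b, elems_b in grid.items():
            if a != b and elems_a <= elems_b:  # subset test => entailment
                pairs.append((a, b))
    return pairs

print(entailments(grid))  # [('talkative', 'friendly')]
```

A full analysis would work over the whole grid and, as the abstract notes, extend the crisp subset test to rating-scale data via a fuzzy semantic model.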
System Identification, Approximation and Complexity
International Journal of General Systems, 1977
Cited by 36 (22 self)
Abstract:
This paper is concerned with establishing broadly-based, system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed based on subjective probability concepts and semantic information theory. The role of structural constraints such as causality, locality, finite memory, etc., is then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir's epistemological hierarchy, and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
Analysis of inconsistency in graph-based viewpoints
In ASE, 2003
Cited by 29 (11 self)
Abstract:
Eliciting the requirements for a proposed system typically involves different stakeholders with different expertise, responsibilities, and perspectives. Viewpoint-based approaches have been proposed as a way to manage incomplete and inconsistent models gathered from multiple sources. In this paper, we propose a category-theoretic framework for the analysis of fuzzy viewpoints. Informally, a fuzzy viewpoint is a graph in which the elements of a lattice are used to specify the amount of knowledge available about the details of nodes and edges. By defining an appropriate notion of morphism between fuzzy viewpoints, we construct categories of fuzzy viewpoints and prove that these categories are (finitely) cocomplete. We then show how colimits can be employed to merge the viewpoints and detect the inconsistencies that arise independent of any particular choice of viewpoint semantics. We illustrate an application of the framework through a case study showing how fuzzy viewpoints can serve as a requirements elicitation tool in reactive systems.
An algebraic framework for merging incomplete and inconsistent views
2004
Cited by 24 (7 self)
Abstract:
View merging, also called view integration, is a key problem in conceptual modeling. Large models are often constructed and accessed by manipulating individual views, but it is important to be able to consolidate a set of views to gain a unified perspective, to understand interactions between views, or to perform various types of end-to-end analysis. View merging is complicated by incompleteness and inconsistency of views. Once views are merged, it is useful to be able to trace the elements of the merged view back to their sources. In this paper, we propose a framework for merging incomplete and inconsistent graph-based views. We introduce a formalism, called poset-annotated graphs, which incorporates a systematic annotation scheme capable of modeling incompleteness and inconsistency as well as providing a built-in mechanism for ownership traceability. We show how structure-preserving maps can capture the relationships between disparate views modeled as poset-annotated graphs, and provide a general algorithm for merging views with arbitrary interconnections. We use the i* modeling language [26] as an example to demonstrate how our approach can be applied to existing graph-based modeling languages, especially in the domain of early Requirements Engineering.
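As a loose illustration (not the paper's poset-annotated formalism), the sketch below merges two graph-based views whose nodes carry annotations from a small, invented knowledge ordering; the join of the annotations resolves differing levels of knowledge, and a list of contributing views is kept for ownership traceability:

```python
# Toy view merge with annotation join and source traceability.
# The three-level ordering unknown < proposed < confirmed is an invented
# stand-in for the annotation posets used in the paper.

LEVEL = {"unknown": 0, "proposed": 1, "confirmed": 2}

def join(a, b):
    """Least upper bound in the (totally ordered) toy annotation poset."""
    return a if LEVEL[a] >= LEVEL[b] else b

def merge(views):
    """views: {view_name: {node: annotation}} -> {node: (annotation, sources)}"""
    merged = {}
    for name, view in views.items():
        for node, ann in view.items():
            old_ann, sources = merged.get(node, ("unknown", []))
            merged[node] = (join(old_ann, ann), sources + [name])
    return merged

views = {
    "alice": {"login": "confirmed", "logout": "proposed"},
    "bob":   {"login": "proposed", "report": "confirmed"},
}
print(merge(views))
```

In the paper's setting the annotations form a general poset rather than a chain, edges as well as nodes are annotated, and the merge is computed as a colimit over arbitrary interconnections; the sketch only conveys the join-plus-traceability idea.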
FUZZY SETS: HISTORY AND BASIC NOTIONS
Cited by 17 (0 self)
Abstract:
This paper is an introduction to fuzzy set theory. It has several purposes. First, it tries to explain the emergence of fuzzy sets from an historical perspective. Looking back at the history of sciences, it seems that fuzzy sets were bound to appear at some point in the 20th century. Indeed, Zadeh's works have crystallized and popularized a concern that appeared in the first half of the century, mainly in philosophical circles. Another purpose of the paper is to scan the basic definitions in the field that are required for a proper reading of the rest of the volume, as well as the other volumes of the Handbooks of Fuzzy Sets Series. This chapter also contains a discussion of the operational semantics of the generally too abstract notion of membership function. Lastly, a survey of variants of fuzzy sets and related matters is provided.
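The notion of a membership function can be made concrete with a toy example: a fuzzy set "tall" over heights, graded in [0, 1] rather than {0, 1}. The 160 cm and 190 cm breakpoints are arbitrary choices for illustration, not values from the paper:

```python
# A fuzzy set given by its membership function: degrees of membership
# in "tall" for heights in centimetres (breakpoints chosen arbitrarily).

def tall(height_cm):
    """Membership degree of height_cm in the fuzzy set 'tall', in [0, 1]."""
    if height_cm <= 160:
        return 0.0
    if height_cm >= 190:
        return 1.0
    return (height_cm - 160) / 30  # linear ramp between the breakpoints

for h in (150, 175, 195):
    print(h, tall(h))  # 150 -> 0.0, 175 -> 0.5, 195 -> 1.0
```

The "operational semantics" question the abstract raises is exactly what such numbers mean, i.e. where a value like 0.5 comes from and how it should be interpreted.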
AN OVERVIEW OF THE APPLICATIONS OF MULTISETS
2007
Cited by 14 (1 self)
Abstract:
This paper presents a systematization of the representation of multisets and basic operations on multisets, and an overview of the applications of multisets in mathematics, computer science and related areas.
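For a concrete flavour of the basic operations such a survey covers, Python's `collections.Counter` models a multiset as an element-to-multiplicity map:

```python
# Basic multiset operations via collections.Counter
# (a multiset is a map from elements to multiplicities).

from collections import Counter

a = Counter("abracadabra")   # {'a': 5, 'b': 2, 'r': 2, 'c': 1, 'd': 1}
b = Counter("banana")        # {'b': 1, 'a': 3, 'n': 2}

print(a + b)   # multiset sum: multiplicities add
print(a & b)   # intersection: minimum multiplicity per element
print(a | b)   # union: maximum multiplicity per element
print(a - b)   # difference: truncated at zero
```

These four operations correspond to the standard multiset sum, intersection, union and difference found in the literature.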
Fuzzy reasoning and logics of uncertainty
In Proceedings of the Sixth International Symposium on Multiple-Valued Logic, IEEE Computer Society Press, Los Alamitos, 1976
Cited by 8 (2 self)
Abstract:
This paper is concerned with the foundations of fuzzy reasoning and its relationships with other logics of uncertainty. The definitions of fuzzy logics are first examined and the role of fuzzification discussed. It is shown that fuzzification of the propositional calculus (PC) gives a known multivalued logic, but with inappropriate semantics of implication, and various alternative forms of implication are discussed. In the main section the discussion is broadened to other logics of uncertainty, and it is argued that there are close links, both formal and semantic, between fuzzy logic and probability logics. A basic multivalued logic is developed in terms of a truth function over a lattice of propositions that encompasses a wide range of logics of uncertainty. Various degrees of truth functionality are then defined and used to derive specific logics, including probability logic and Lukasiewicz infinitely-valued logic. Quantification and modal operators over the basic logic are introduced. Finally, a semantics for the basic logic is introduced in terms of a population (of events, or people, or neurons), and the semantic significance of the constraints giving rise to different logics is discussed.
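The complaint about fuzzification giving inappropriate implication semantics can be seen in a small numeric sketch: directly fuzzifying material implication with the min/max/complement operators fails the tautology "p implies p" at intermediate truth values, whereas the Lukasiewicz implication (one of the alternative forms discussed in this literature) preserves it:

```python
# Truth-functional fuzzy connectives over [0, 1].

def f_and(p, q): return min(p, q)
def f_or(p, q):  return max(p, q)
def f_not(p):    return 1 - p

def implies_naive(p, q):
    """Material implication (not p) or q, fuzzified directly."""
    return f_or(f_not(p), q)

def implies_lukasiewicz(p, q):
    """Lukasiewicz implication: min(1, 1 - p + q)."""
    return min(1, 1 - p + q)

p = 0.5
print(implies_naive(p, p))        # 0.5: 'p implies p' is not fully true
print(implies_lukasiewicz(p, p))  # 1.0: the tautology is preserved
```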
Precise past – fuzzy future
International Journal of Man-Machine Studies, 1983
Cited by 6 (4 self)
Abstract:
This paper examines the motivation and foundations of fuzzy set theory, now some 20 years old, particularly possible misconceptions about its operators and its relation to probability theory. It presents a standard uncertainty logic (SUL) that subsumes standard propositional, fuzzy and probability logics, and shows how many key results may be derived within SUL without further constraints. These include resolutions of standard paradoxes such as those of the bald man and of the barber, decision rules used in pattern recognition and control, the derivation of numeric truth values from the axiomatic form of the SUL, and the derivation of operators such as the arithmetic mean. The addition of the constraint of truth-functionality to a SUL is shown to give fuzzy, or Lukasiewicz infinitely-valued, logic. The addition of the constraint of the law of the excluded middle to a SUL is shown to give probability, or modal S5, logic. An example is given of the use of the two logics in combination to give a possibility vector when modelling sequential behaviour with uncertain observations.
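The two constraints contrasted in the abstract can be sketched numerically: a truth-functional (fuzzy) reading lets "A or not A" fall below full truth, while a probabilistic reading keeps it certain at the cost of truth-functionality. The 0.5 value is an arbitrary illustration:

```python
# Contrast of the two added constraints on an uncertainty logic.

p = 0.5  # degree of truth / probability of "this man is bald"

# Fuzzy (truth-functional): the value of 'A or not A' is computed
# from the values of its parts, and max(p, 1 - p) need not be 1.
fuzzy_excluded_middle = max(p, 1 - p)
print(fuzzy_excluded_middle)   # 0.5

# Probability: 'A or not A' is certain, but the value of a compound
# like 'A and B' cannot in general be computed from p(A) and p(B) alone.
prob_excluded_middle = p + (1 - p)
print(prob_excluded_middle)    # 1.0
```

This is the sense in which the bald-man (sorites) paradox dissolves differently under the two constraints.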
Tossing Algebraic Flowers down the Great Divide
In People and Ideas in Theoretical Computer Science, 1999
Cited by 6 (0 self)
Abstract:
Data Types and Algebraic Semantics: The history of programming languages, and to a large extent of software engineering as a whole, can be seen as a succession of ever more powerful abstraction mechanisms. The first stored program computers were programmed in binary, which soon gave way to assembly languages that allowed symbolic codes for operations and addresses. Fortran began the spread of "high level" programming languages, though at the time it was strongly opposed by many assembly programmers; important features that developed later include blocks, recursive procedures, flexible types, classes, inheritance, modules, and genericity. Without going into the philosophical problems raised by abstraction (which in view of the discussion of realism in Section 4 may be considerable), it seems clear that the mathematics used to describe programming concepts should in general get more abstract as the programming concepts get more abstract. Nevertheless, there has been great resistance to u...
Analogy categories, virtual machines and structured programming
In GI2 Jahrestagung, Lecture Notes in Computer Science, 1975
Cited by 5 (5 self)
Abstract:
This paper arises from a number of studies of machine/problem relationships, software development techniques, language and machine design. It develops a category-theoretic framework for the analysis of the relationships between programmer, virtual machine, and problem that are inherent in discussions of "ease of programming", "good programming techniques", "structured programming", and so on. The concept of "analogy" is introduced as an explicatum of the comprehensibility of the relationship between two systems. Analogy is given a formal definition in terms of a partially ordered structure of analogy categories whose minimal element is a "truth" or "proof" category. The theory is constructive, and analogy relationships are computable between defined systems, or classes of system. Thus the structures developed may be used to study the relationships between programmer, problem, and virtual machine in practical situations.