Results 1–10 of 21
System Identification, Approximation and Complexity
 International Journal of General Systems
, 1977
Abstract

Cited by 34 (23 self)
This paper is concerned with establishing broadly-based system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed based on subjective probability concepts and semantic information theory. The roles of structural constraints such as causality, locality, finite memory, etc., are then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir's epistemological hierarchy and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
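The admissibility criterion above lends itself to a small illustration. The following is a minimal sketch under assumed data, not the paper's formalism: models are reduced to (complexity, approximation-error) pairs, and a model is kept only if every strictly simpler model approximates strictly worse.

```python
def admissible(models):
    """models: list of (complexity, approximation_error) pairs.
    Returns the admissible subset: pairs for which every pair of
    strictly lower complexity has strictly higher error."""
    result = []
    for c, e in models:
        if all(e2 > e for c2, e2 in models if c2 < c):
            result.append((c, e))
    return result

# Four hypothetical candidate models; the third is dropped because a
# simpler model (complexity 2) approximates equally well.
candidates = [(1, 0.9), (2, 0.5), (3, 0.5), (4, 0.1)]
print(admissible(candidates))  # → [(1, 0.9), (2, 0.5), (4, 0.1)]
```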
New directions in the analysis and interactive elicitation of personal construct systems
 In
, 1981
Abstract

Cited by 33 (24 self)
The computer elicitation and analysis of personal construct systems has become a technique of great interest and wide application in recent years. This paper takes the current state of the art as a starting point and explores further developments that are natural extensions of it. The overall objective of the work described is to develop man-computer symbiotic systems in which the computer is a truly dialectical partner to the person in forming theories and making decisions. A logical model of constructs as predicates applying to elements is used to develop a logical analysis of construct structures, and this is contrasted with various distance-based clustering techniques. A grid analysis program called ENTAIL is described based on these techniques which derives a network of entailments from a grid. This is compared and contrasted with various programs for repertory grid analysis such as INGRID, FOCUS and Q-Analysis. Entailment is discussed in relation to Kelly's superordination hierarchy over constructs and preference relations over elements. The entailment analysis is extended to rating-scale data using a fuzzy semantic model. The significance of Kelly's notion of the opposite to a construct, as opposed to its negation, is discussed and related to other epistemological models and the role of relevance. Finally, the interactive construct elicitation program PEGASUS is considered in terms of the psychological and philosophical importance of the dialectical processes of grid elicitation and analysis, and recommendations are made about its generalization and extension based on the logical foundations described. Links are established between the work on repertory grids and that on relational data bases and expert systems.
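The core idea behind deriving entailments from a grid can be sketched informally. The toy example below (hypothetical construct names; not the actual ENTAIL program, which uses a richer logical model) treats each construct as the set of elements it applies to, and reads "A entails B" as set inclusion of extensions.

```python
def entailments(grid):
    """grid: dict mapping construct name -> set of elements it applies to.
    Returns (A, B) pairs with A != B where A's extension is a subset
    of B's, i.e. A entails B."""
    return [(a, b)
            for a in grid for b in grid
            if a != b and grid[a] <= grid[b]]

# Hypothetical binary grid: "friendly" applies to e1, e2; every
# friendly element is also sociable, so "friendly" entails "sociable".
grid = {
    "friendly": {"e1", "e2"},
    "sociable": {"e1", "e2", "e3"},
    "reserved": {"e4"},
}
print(sorted(entailments(grid)))  # → [('friendly', 'sociable')]
```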
Analysis of inconsistency in graphbased viewpoints
 In ASE
, 2003
Abstract

Cited by 27 (11 self)
Eliciting the requirements for a proposed system typically involves different stakeholders with different expertise, responsibilities, and perspectives. Viewpoints-based approaches have been proposed as a way to manage incomplete and inconsistent models gathered from multiple sources. In this paper, we propose a category-theoretic framework for the analysis of fuzzy viewpoints. Informally, a fuzzy viewpoint is a graph in which the elements of a lattice are used to specify the amount of knowledge available about the details of nodes and edges. By defining an appropriate notion of morphism between fuzzy viewpoints, we construct categories of fuzzy viewpoints and prove that these categories are (finitely) cocomplete. We then show how colimits can be employed to merge the viewpoints and detect the inconsistencies that arise, independent of any particular choice of viewpoint semantics. We illustrate an application of the framework through a case study showing how fuzzy viewpoints can serve as a requirements elicitation tool in reactive systems.
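The role of lattice annotations in merging can be suggested with a toy example. The sketch below assumes a small linear knowledge lattice and dictionary-shaped viewpoints, neither of which comes from the paper's categorical construction: shared nodes are identified and their annotations combined by lattice join, in the spirit of a colimit over the shared part.

```python
LATTICE = ["unknown", "proposed", "confirmed"]  # ordered low -> high

def join(a, b):
    """Join in the linear lattice above: the higher of the two elements."""
    return max(a, b, key=LATTICE.index)

def merge(v1, v2):
    """v1, v2: dicts node -> annotation. Shared nodes are identified and
    annotated with the join; other nodes are carried over unchanged."""
    merged = dict(v1)
    for node, ann in v2.items():
        merged[node] = join(merged.get(node, "unknown"), ann)
    return merged

# Two hypothetical viewpoints that disagree on how settled "controller" is.
v1 = {"sensor": "confirmed", "controller": "proposed"}
v2 = {"controller": "confirmed", "actuator": "proposed"}
print(merge(v1, v2))
```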
An algebraic framework for merging incomplete and inconsistent views
, 2004
Abstract

Cited by 16 (5 self)
View merging, also called view integration, is a key problem in conceptual modeling. Large models are often constructed and accessed by manipulating individual views, but it is important to be able to consolidate a set of views to gain a unified perspective, to understand interactions between views, or to perform various types of end-to-end analysis. View merging is complicated by incompleteness and inconsistency of views. Once views are merged, it is useful to be able to trace the elements of the merged view back to their sources. In this paper, we propose a framework for merging incomplete and inconsistent graph-based views. We introduce a formalism, called poset-annotated graphs, which incorporates a systematic annotation scheme capable of modeling incompleteness and inconsistency as well as providing a built-in mechanism for ownership traceability. We show how structure-preserving maps can capture the relationships between disparate views modeled as poset-annotated graphs, and provide a general algorithm for merging views with arbitrary interconnections. We use the i* modeling language [26] as an example to demonstrate how our approach can be applied to existing graph-based modeling languages, especially in the domain of early Requirements Engineering.
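The ownership-traceability idea can be illustrated in miniature. In the hypothetical sketch below (names and data structures are illustrative, not the paper's poset-annotated formalism), each element of the merged view records which source views contributed it.

```python
def merge_with_provenance(views):
    """views: dict view_name -> set of elements.
    Returns dict element -> sorted list of contributing view names,
    so every merged element can be traced back to its sources."""
    provenance = {}
    for name, elements in views.items():
        for e in elements:
            provenance.setdefault(e, []).append(name)
    return {e: sorted(names) for e, names in provenance.items()}

# Two hypothetical stakeholder views sharing the "login" element.
views = {"customer": {"login", "search"}, "admin": {"login", "audit"}}
print(merge_with_provenance(views))
```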
Precise past – fuzzy future
 International Journal of Man-Machine Studies
, 1983
Abstract

Cited by 6 (4 self)
This paper examines the motivation and foundations of fuzzy set theory, now some 20 years old, particularly possible misconceptions about its operators and its relationship to probability theory. It presents a standard uncertainty logic (SUL) that subsumes standard propositional, fuzzy and probability logics, and shows how many key results may be derived within SUL without further constraints. These include resolutions of standard paradoxes such as those of the bald man and of the barber, decision rules used in pattern recognition and control, the derivation of numeric truth values from the axiomatic form of the SUL, and the derivation of operators such as the arithmetic mean. The addition of the constraint of truth-functionality to a SUL is shown to give fuzzy, or Lukasiewicz infinitely-valued, logic. The addition of the constraint of the law of the excluded middle to a SUL is shown to give probability, or modal S5, logic. An example is given of the use of the two logics in combination to give a possibility vector when modelling sequential behaviour with uncertain observations.
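The trade-off the abstract describes can be made concrete with assumed operators (the usual min/max fuzzy connectives, not the paper's SUL axioms): truth-functional fuzzy logic gives up the law of the excluded middle, while probability keeps the law but is not truth-functional.

```python
# Fuzzy (max/min) connectives are truth-functional...
f_not = lambda a: 1 - a
f_or = max

# ...but the excluded middle fails for intermediate truth values:
a = 0.5
print(f_or(a, f_not(a)))  # → 0.5, not 1

# Probability satisfies P(A or not-A) = 1, but is not truth-functional:
# two situations with identical marginals P(A) = P(B) = 0.5 can give
# different values of P(A and B), so no function of the marginals exists.
independent = 0.5 * 0.5  # A and B independent -> 0.25
identical = 0.5          # B is the very same event as A -> 0.5
print(independent, identical)
```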
Fuzzy reasoning and logics of uncertainty
 Proceedings of the sixth international symposium on Multiple-valued logic. IEEE Computer Society Press, Los Alamitos
, 1976
Abstract

Cited by 5 (2 self)
This paper is concerned with the foundations of fuzzy reasoning and its relationships with other logics of uncertainty. The definitions of fuzzy logics are first examined and the role of fuzzification discussed. It is shown that fuzzification of PC gives a known multivalued logic but with inappropriate semantics of implication and various alternative forms of implication are discussed. In the main section the discussion is broadened to other logics of uncertainty and it is argued that there are close links, both formal and semantic, between fuzzy logic and probability logics. A basic multivalued logic is developed in terms of a truth function over a lattice of propositions that encompasses a wide range of logics of uncertainty. Various degrees of truth functionality are then defined and used to derive specific logics including probability logic and Lukasiewicz infinitely valued logic. Quantification and modal operators over the basic logic are introduced. Finally, a semantics for the basic logic is introduced in terms of a population (of events, or people, or neurons) and the semantic significance of the constraints giving rise to different logics is discussed.
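The inappropriate implication semantics mentioned above can be shown numerically. The operators below are standard textbook definitions assumed for illustration, not taken from the paper: under the fuzzified material implication max(1-a, b), self-implication is not fully true at intermediate truth values, while the Lukasiewicz implication min(1, 1-a+b) repairs this.

```python
def material(a, b):
    """Fuzzified classical material implication: max(not a, b)."""
    return max(1 - a, b)

def lukasiewicz(a, b):
    """Lukasiewicz implication: min(1, 1 - a + b)."""
    return min(1.0, 1 - a + b)

a = 0.5
print(material(a, a))     # → 0.5: "a implies a" only half true
print(lukasiewicz(a, a))  # → 1.0: self-implication fully true
```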
Analogy categories, virtual machines and structured programming
 GI2 Jahrestagung, Lecture Notes in Computer Science
, 1975
Abstract

Cited by 5 (5 self)
This paper arises from a number of studies of machine/problem relationships, software development techniques, language and machine design. It develops a category-theoretic framework for the analysis of the relationships between programmer, virtual machine, and problem that are inherent in discussions of "ease of programming", "good programming techniques", "structured programming", and so on. The concept of "analogy" is introduced as an explicatum of the comprehensibility of the relationship between two systems. Analogy is given a formal definition in terms of a partially ordered structure of analogy categories whose minimal element is a "truth", or "proof", category. The theory is constructive and analogy relationships are computable between defined systems, or classes of system. Thus the structures developed may be used to study the relationships between programmer, problem, and virtual machine in practical situations.
Fuzzy object comparison and its application to a self-adaptable query mechanism
 In IFSA'95, volume I
, 1995
Abstract

Cited by 3 (2 self)
This paper describes a generic strategy for “intelligently” querying hierarchical semantic networks whose objects are characterized by fuzzy attributes. The strategy is self-adaptable in the sense that the network will keep structuring itself (as long as the user queries the system) by adding more and more detail to an underlying semilattice built on top of a (fuzzy) object comparison partial order. Technical bounds are imposed on the fuzziness of the system by formally specifying the network and reasoning about it. In one direction, fuzziness is welcome insofar as it increases the flexibility and discriminating power of the comparison mechanism. In the opposite direction, arbitrary fuzziness may be counterproductive because it will end up destroying the formal properties desirable for such a mechanism. An experimental illustration of the proposed strategy is presented which reinterprets Prieto-Díaz's faceted (fuzzy) scheme for software-component classification and reuse.
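The object-comparison partial order can be sketched as pointwise dominance of fuzzy attribute memberships. The attribute names and membership values below are hypothetical, for illustration only; the paper's actual comparison mechanism is richer.

```python
def dominated(o1, o2):
    """True when o1's membership in every attribute is <= o2's,
    i.e. o1 precedes o2 in the pointwise partial order."""
    attrs = set(o1) | set(o2)
    return all(o1.get(a, 0.0) <= o2.get(a, 0.0) for a in attrs)

# Two hypothetical software components described by fuzzy attributes.
sorter = {"reusable": 0.6, "portable": 0.3}
generic = {"reusable": 0.9, "portable": 0.5}
print(dominated(sorter, generic))  # → True
print(dominated(generic, sorter))  # → False (the order is partial)
```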
Tossing Algebraic Flowers down the Great Divide
 In People and Ideas in Theoretical Computer Science
, 1999
Abstract

Cited by 3 (0 self)
Data Types and Algebraic Semantics
The history of programming languages, and to a large extent of software engineering as a whole, can be seen as a succession of ever more powerful abstraction mechanisms. The first stored program computers were programmed in binary, which soon gave way to assembly languages that allowed symbolic codes for operations and addresses. fortran began the spread of "high level" programming languages, though at the time it was strongly opposed by many assembly programmers; important features that developed later include blocks, recursive procedures, flexible types, classes, inheritance, modules, and genericity. Without going into the philosophical problems raised by abstraction (which in view of the discussion of realism in Section 4 may be considerable), it seems clear that the mathematics used to describe programming concepts should in general get more abstract as the programming concepts get more abstract. Nevertheless, there has been great resistance to u...
Interactive knowledge elicitation
 In Proceedings of CIPS SESSION 84
, 1984
Abstract

Cited by 2 (1 self)
This paper outlines a methodology for knowledge elicitation that has been implemented in a computer program that interacts with an expert to enable him to express the constructs underlying his knowledge. Experiments on validating the system by using it to reconstruct the Business Information Analysis and Integration Technique (BIAIT) are reported.
Knowledge Engineering
The initial success of expert system developments (Michie, 1979) and the development of a number of reasonably domain-independent software support systems for the encoding and application of knowledge (Hayes-Roth, Waterman and Lenat, 1983) have opened up the possibility of widespread usage of expert systems. In particular, the Japanese Fifth Generation Computing development program (Motooka, 1982) assumes this will happen and is targeted on knowledge processing rather than information processing. However, what Feigenbaum (1980)