Results 1–10 of 16
System Identification, Approximation and Complexity
 International Journal of General Systems
, 1977
Abstract

Cited by 34 (23 self)
This paper is concerned with establishing broadly-based system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed based on subjective probability concepts and semantic information theory. The role of structural constraints such as causality, locality, finite memory, etc., is then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir’s epistemological hierarchy, and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
On Equality Up-to Constraints over Finite Trees, Context Unification, and One-Step Rewriting
Abstract

Cited by 27 (7 self)
We introduce equality up-to constraints over finite trees and investigate their expressiveness. Equality up-to constraints subsume equality constraints, subtree constraints, and one-step rewriting constraints.
Periodicity on Partial Words
 Computers and Mathematics with Applications 47
, 2004
Abstract

Cited by 22 (8 self)
Codes play an important role in the study of combinatorics on words. Recently, we introduced p-codes that play a role in the study of combinatorics on partial words. Partial words are strings over a finite alphabet that may contain a number of “do not know” symbols. In this paper, the theory of codes of words is revisited starting from p-codes of partial words. We present some important properties of p-codes. We give several equivalent definitions of p-codes and the monoids they generate. We investigate in particular the Defect Theorem for partial words. We describe an algorithm to test whether or not a finite set of partial words is a p-code. We also discuss two-element p-codes, complete p-codes, maximal p-codes, and the class of circular p-codes. A World Wide Web server interface has been established at
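The compatibility relation underlying p-codes can be made concrete. The following Python snippet is a minimal illustration, not the paper's algorithm: it assumes `?` marks a hole (a “do not know” symbol) and tests the p-code property only up to a bounded number of factors, which a real decision procedure would not need.

```python
from itertools import product

HOLE = "?"  # assumed marker for a "do not know" symbol

def compatible(u: str, v: str) -> bool:
    """Partial words of equal length are compatible if, at every
    position, the letters agree or at least one is a hole."""
    return len(u) == len(v) and all(
        a == b or a == HOLE or b == HOLE for a, b in zip(u, v)
    )

def is_pcode_bounded(X, max_factors=3):
    """Naive bounded check: X fails to be a p-code if two distinct
    factorizations over X yield compatible concatenations."""
    for m in range(1, max_factors + 1):
        for n in range(1, max_factors + 1):
            for us in product(X, repeat=m):
                for vs in product(X, repeat=n):
                    if us != vs and compatible("".join(us), "".join(vs)):
                        return False, (us, vs)
    return True, None
```

For example, `{"a?", "ab"}` fails immediately because the two words are compatible as single factors, while `{"a?", "ba"}` passes the bounded check since the two words already clash in their first letter.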
Exact Exploration and Hanging Algorithms
Abstract

Cited by 7 (7 self)
Recent analysis of sequential algorithms resulted in their axiomatization and in a representation theorem stating that, for any sequential algorithm, there is an abstract state machine (ASM) with the same states, initial states and state transitions. That analysis, however, abstracted from details of intra-step computation, and the ASM produced in the proof of the representation theorem may, and often does, explore parts of the state unexplored by the algorithm. We refine the analysis, the axiomatization and the representation theorem. Emulating a step of the given algorithm, the ASM produced in the proof of the new representation theorem explores exactly the part of the state explored by the algorithm. That frugality pays off when state exploration is costly. The algorithm may be a high-level specification, and a simple function call on the abstraction level of the algorithm may hide expensive interaction with the environment. Furthermore, the original analysis presumed that state functions are total. Now we allow state functions, including equality, to be partial, so that a function call may cause the algorithm as well as the ASM to hang. Since the emulating ASM does not make any superfluous function calls, it hangs only if the algorithm does. [T]he monotony of equality can only lead us to boredom. —Francis Picabia
A Unified Language Processing Methodology
 Theoretical Computer Science
, 2001
Abstract

Cited by 5 (2 self)
This paper discusses a mathematical concept of language that models both artificial and natural languages and thus provides a framework for a unified language processing methodology. This concept of a language is regarded as a communication tool that allows language users to develop knowledge, while interacting with their universe of discourse, and to communicate with each other, while exchanging knowledge. Criteria for consistent usage of a language are established using a Galois connection between language syntax and language semantics. Solutions to ambiguity, paraphrase, attitude, and other problems concerning the relationship between syntax and semantics are addressed. A general schema for language specification is introduced, and algorithms that perform language generation and language analysis are discussed as universal tools defined by the specification schema. Language transformations performed by various kinds of translators are examined, and correctness criteria for these translators are defined using the language Galois connection. The paper is structured as follows: Section 1 introduces the framework and justifies the necessity of a unified methodology for language processing. Section 2 presents the mathematical concept of a language. Section 3 illustrates the mathematical concept of a language with three kinds of language structures: natural language, logical language, and programming language. Section 4 discusses the algebraic mechanism of language specification that unifies the methodology for language processing tool development. Section 5 formalizes the criterion for the consistency of the language usage, defines the architecture of a unified language processing system, and shows how the consistency criteria for language usage can be employed as correct...
Applications of Diagrams to Decision Problems
, 1993
Abstract

Cited by 3 (2 self)
Classical decision problems such as the word and conjugacy problem are introduced, and methods are given for solving them in certain cases. All the methods we present involve van Kampen diagrams as one of the most powerful tools when dealing with the classical decision problems. In 1912 Max Dehn formulated in his article „Über unendliche diskontinuierliche Gruppen“ (“On infinite discontinuous groups”) three fundamental problems for infinite groups given by finite presentations: the identity problem, the transformation problem, and the isomorphism problem. The following is a translation of Dehn's definition of the first two problems, called in modern terms the word problem and the conjugacy problem. The identity problem (word problem): Let an arbitrary element of the group be given as a product of the generators. Find a method to decide in a finite number of steps whether or not this element equals the identity element. The transformation problem (conjugacy proble...
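Although the word problem is undecidable for finite presentations in general, the special case of a free group (no relators) has an elementary solution: freely reduce the word by cancelling adjacent inverse pairs and check whether anything remains. A minimal Python sketch, independent of the diagram-based methods above; words are represented as lists of (generator, ±1) pairs:

```python
def free_reduce(word):
    """Freely reduce a word in a free group, given as a list of
    (generator, exponent) pairs with exponent +1 or -1, by cancelling
    adjacent inverse pairs x x^{-1}."""
    stack = []
    for g, e in word:
        if stack and stack[-1] == (g, -e):
            stack.pop()          # x followed by x^{-1} cancels
        else:
            stack.append((g, e))
    return stack

def is_identity(word):
    """Word problem for a free group: a word equals the identity
    iff its free reduction is empty."""
    return not free_reduce(word)
```

For instance, `a b b^{-1} a^{-1}` reduces to the empty word, so it represents the identity, while `a b` does not.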
Equations on partial words
 MFCS 2006 31st International Symposium on Mathematical Foundations of Computer Science, Lecture Notes in Computer Science
, 2006
Abstract

Cited by 3 (2 self)
It is well known that some of the most basic properties of words, like commutativity (xy = yx) and conjugacy (xz = zy), can be expressed as solutions of word equations. An important problem is to decide whether or not a given equation on words has a solution. For instance, the equation x^m y^n = z^p has only periodic solutions in a free monoid; that is, if x^m y^n = z^p holds with integers m, n, p ≥ 2, then there exists a word w such that x, y, z are powers of w. This result, which received a lot of attention, was first proved by Lyndon and Schützenberger for free groups. In this paper, we investigate equations on partial words. Partial words are sequences over a finite alphabet that may contain a number of “do not know” symbols. When we speak about equations on partial words, we replace the notion of equality (=) with compatibility (↑). Among other equations, we solve xy ↑ yx, xz ↑ zy, and special cases of x^m y^n ↑ z^p for integers m, n, p ≥ 2.
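The Lyndon–Schützenberger conclusion for total words can be checked mechanically on small instances: compute each word's primitive root (the shortest word it is a power of) and confirm the three roots coincide. A hypothetical helper, not taken from the paper:

```python
def primitive_root(s: str) -> str:
    """Shortest word w such that s is a power of w."""
    n = len(s)
    for d in range(1, n + 1):
        if n % d == 0 and s[:d] * (n // d) == s:
            return s[:d]
    return s

# An instance of x^2 y^2 = z^2: x = "ab", y = "abab", z = "ababab".
x, y, z = "ab", "abab", "ababab"
assert x * 2 + y * 2 == z * 2
# As the theorem predicts, x, y, z share the primitive root "ab".
common_root = primitive_root(x)
```

Here all three roots equal "ab", matching the statement that x, y, z must be powers of a common word w whenever m, n, p ≥ 2.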
An Algorithm for the Solution of Tree Equations
 Proceedings CAAP-TAPSOFT'97. Lecture Notes in Computer Science 1214
, 1997
Abstract

Cited by 2 (2 self)
We consider the problem of solving equations over k-ary trees. Here an equation is a pair of labeled α-ary trees, where α is a function associating an arity to each label. A solution to an equation is a morphism from α-ary trees to k-ary trees that maps the left- and right-hand side of the equation to the same k-ary tree. This problem is a generalization of the word unification problem posed by A. Markov in the fifties, which corresponds to the case k = 1 (in this case also the arity function α must be identically equal to 1, and equations are pairs of words). The word unification problem was solved in two steps. First, in 1976 Makanin proved the decidability of the existence of a solution to a word equation, and more recently, in 1990, Jaffar gave an algorithm that finds the set of all principal solutions to a word equation when this set is finite. In this paper we solve the α-ary tree equation problem for all other k > 1. We describe an efficient unification algorithm that on inpu...
Verification as specialization of interpreters with respect to data
 FIRST INTERNATIONAL WORKSHOP ON METACOMPUTATION IN RUSSIA (META 2008)
, 2008
Abstract

Cited by 2 (0 self)
In this paper we explain the technique of verification via supercompilation, taking as an example verification of the parameterised Load Balancing Monitor system. We demonstrate a detailed executable specification of the Load Balancing Monitor protocol in the functional programming language REFAL and discuss the result of its supercompilation by the supercompiler SCP4. This case study is interesting both from the point of view of verification and from that of program specialization. From the point of view of verification, a new type of nondeterminism is involved in the protocol, which has not been covered yet in previous applications of the technique. With regard to program specialization, we argued earlier that our approach to program verification may be seen as specialization of interpreters with respect to data [25]. We showed that by supercompilation of an interpreter of a very simple, purely imperative programming language. The language corresponding to the Load Balancing Monitor protocol that we consider here has some features of both imperative and functional languages.
Life-Like Computing Beyond the Machine Metaphor
 In: R. Paton
, 1993
Abstract

Cited by 1 (0 self)
The question of what models, if any, can serve to represent the complexity of life is a very important one. The application of biological ideas to novel software or hardware designs, or the seemingly opposite but in effect closely related task of using existing computers for the study of life-like phenomena, requires an at least partial clarification of what computers can do. The subject of study of this paper is a foundational question of this kind. Following a few earlier writings [Kampis 1991a, Kampis 1991b], we attempt here to give a short non-technical summary for the non-specialist of a set of general ideas about computer modelling, and to present an account of an operational modelling methodology for dealing with models of life, and in particular, with models of evolving systems. Evolvability is perhaps the most distinctive characteristic of living systems. Many biologists, like J. Maynard Smith [1975, 1986] or R. Dawkins [1985], consider this to be the key to l