Results 1–10 of 33
Toward robust integrated circuits: The embryonics approach
 Proceedings of the IEEE
, 2000
"... The growth and operation of all living beings are directed by the interpretation, in each of their cells, of a chemical program, the DNA string or genome. This process is the source of inspiration for the Embryonics (embryonic electronics) project, whose final objective is the design of highly robus ..."
Abstract

Cited by 56 (13 self)
 Add to MetaCart
The growth and operation of all living beings are directed by the interpretation, in each of their cells, of a chemical program, the DNA string or genome. This process is the source of inspiration for the Embryonics (embryonic electronics) project, whose final objective is the design of highly robust integrated circuits, endowed with properties usually associated with the living world: self-repair (cicatrization) and self-replication. The Embryonics architecture is based on four hierarchical levels of organization. 1) The basic primitive of our system is the molecule, a multiplexer-based element of a novel programmable circuit. 2) A finite set of molecules makes up a cell, essentially a small processor with an associated memory. 3) A finite set of cells makes up an organism, an application-specific multiprocessor system. 4) The organism can itself replicate, giving rise to a population of identical organisms. We begin by describing in detail the implementation of an artificial cell characterized by
Dynamic Dependency Grammar
 Linguistics and Philosophy
, 1994
"... this paper. Thanks are also due to Steve Pulman, Ewan Klein, David Beaver and Guy Barry for discussion during the early stages of the work, and to other members of the University of Edinburgh Centre for Cognitive Science and the University of Cambridge Computer Laboratory. The research was supported ..."
Abstract

Cited by 46 (4 self)
 Add to MetaCart
this paper. Thanks are also due to Steve Pulman, Ewan Klein, David Beaver and Guy Barry for discussion during the early stages of the work, and to other members of the University of Edinburgh Centre for Cognitive Science and the University of Cambridge Computer Laboratory. The research was supported by the British Science and Engineering Research Council (Research Fellowship B/90/ITF/288, and Research Grant RR30718)
Characterizing the Behavior of a Program Using Multiple-Length N-grams
, 2000
"... Some recent advances in intrusion detection are based on detecting anomalies in program behavior, as characterized by the sequence of kernel calls the program makes. Specifically, traces of kernel calls are collected during a training period. The substrings of fixed length N (for some N) of those tr ..."
Abstract

Cited by 36 (0 self)
 Add to MetaCart
Some recent advances in intrusion detection are based on detecting anomalies in program behavior, as characterized by the sequence of kernel calls the program makes. Specifically, traces of kernel calls are collected during a training period. The substrings of fixed length N (for some N) of those traces are called N-grams. The set of N-grams occurring during normal execution has been found to discriminate effectively between normal behavior of a program and the behavior of the program under attack. The N-gram characterization, while effective, requires the user to choose a suitable value for N. This paper presents an alternative characterization, as a finite state machine whose states represent predictive sequences of different lengths. An algorithm is presented to construct the finite state machine from training data, based on traditional string-processing data structures but employing some novel techniques.
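As a minimal sketch of the fixed-length N-gram profile this abstract describes (the paper's multiple-length finite-state-machine variant is not reproduced here, and the kernel-call names below are illustrative, not from the paper):

```python
def ngrams(trace, n):
    """All contiguous length-n substrings of a kernel-call trace."""
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

# Hypothetical training traces collected during normal execution.
normal_traces = [
    ["open", "read", "read", "write", "close"],
    ["open", "read", "write", "close"],
]
N = 3
profile = set()
for t in normal_traces:
    profile |= ngrams(t, N)  # the set of N-grams seen during training

def anomaly_count(trace, profile, n):
    """Number of n-grams in a trace never seen during training."""
    return len(ngrams(trace, n) - profile)

# A trace containing an unexpected exec call trips the detector.
suspect = ["open", "read", "exec", "write", "close"]
print(anomaly_count(suspect, profile, N))   # 3 unseen 3-grams
print(anomaly_count(normal_traces[0], profile, N))  # 0
```

In practice the anomaly count (or rate) over a sliding window is compared against a threshold; the paper's contribution is removing the need to fix N in advance.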
System Identification, Approximation and Complexity
 International Journal of General Systems
, 1977
"... This paper is concerned with establishing broadlybased systemtheoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a ..."
Abstract

Cited by 34 (23 self)
 Add to MetaCart
This paper is concerned with establishing broadly based system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed based on subjective probability concepts and semantic information theory. The role of structural constraints such as causality, locality, finite memory, etc., is then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir's epistemological hierarchy, and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
Toward a Viable, Self-Reproducing Universal Computer
 Physica D
, 1996
"... Selfreproducing, cellular automatabased systems developed to date broadly fall under two categories; the first consists of machines which are capable of performing elaborate tasks, yet are too complex to simulate, while the second consists of extremely simple machines which can be entirely impleme ..."
Abstract

Cited by 20 (1 self)
 Add to MetaCart
Self-reproducing, cellular automata-based systems developed to date broadly fall under two categories: the first consists of machines which are capable of performing elaborate tasks, yet are too complex to simulate, while the second consists of extremely simple machines which can be entirely implemented, yet lack any additional functionality aside from self-reproduction. In this paper we present a self-reproducing system which is completely realizable, while capable of executing any desired program, thereby exhibiting universal computation. Our starting point is a simple self-reproducing loop structure onto which we "attach" an executable program (Turing machine) along with its data. The three parts of our system (loop, program, data) are all reproduced, after which the program is run on the given data. The system reported in this paper has been simulated in its entirety; thus, we attain a viable, self-reproducing machine with programmable capabilities.
On the Difficulty of Computations
, 1970
"... Two practical considerations concerning the use of computing machinery are the amount of information that must be given to the machine for it to perform a given task and the time it takes the machine to perform it. The size of programs and their running time are studied for mathematical models of co ..."
Abstract

Cited by 18 (4 self)
 Add to MetaCart
Two practical considerations concerning the use of computing machinery are the amount of information that must be given to the machine for it to perform a given task and the time it takes the machine to perform it. The size of programs and their running time are studied for mathematical models of computing machines. The study of the amount of information (i.e., number of bits) in a computer program needed for it to put out a given finite binary sequence leads to a definition of a random sequence; the random sequences of a given length are those that require the longest programs. The study of the running time of programs for computing infinite sets of natural numbers leads to an arithmetic of computers, which is a distributive lattice.
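The program-size definition of randomness rests on a counting argument: there are far fewer short programs than long strings. The sketch below is our illustration of that bound, not code from the paper:

```python
# Counting argument behind program-size randomness: there are only
# 2**(m+1) - 1 binary programs of length at most m, so almost no n-bit
# string has a generating program much shorter than n bits. The
# "longest-program" strings of each length are the random ones.
def max_compressible_fraction(n, k):
    """Upper bound on the fraction of n-bit strings whose shortest
    program is at most n - k bits long (compressible by >= k bits)."""
    short_programs = 2 ** (n - k + 1) - 1  # programs of length 0 .. n-k
    return short_programs / 2 ** n

# Fewer than 0.2% of 32-bit strings can be described by a program
# at least 10 bits shorter than the string itself.
print(max_compressible_fraction(32, 10))
```

The bound is independent of n: compressing by k bits is possible for less than a 2**(1-k) fraction of strings of any length.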
A Classical Automata Approach to Noninterference Type Problems
 In: The Computer Security Foundations Workshop V proceedings: June 16–18, 1992, the Franconia Inn
, 1992
"... Using classical automata theory we show how noninterference can be viewed as a relatively simple phenomenon. We also give direction for future work concerning probabilistic security problems using classical automata theory. 1 Introduction Many models have been proposed to model a secure computer s ..."
Abstract

Cited by 8 (0 self)
 Add to MetaCart
Using classical automata theory we show how noninterference can be viewed as a relatively simple phenomenon. We also give direction for future work concerning probabilistic security problems using classical automata theory. Many models have been proposed to model a secure computer system. Some of the representative early models are by Harrison et al. [10], Denning [4], and the often-mentioned Bell-LaPadula model [3]. Depending on how one interprets concepts such as "subject/user" and "object", it is not clear whether or not covert channels are taken into consideration in these models. Noninterference [6, 7] was a concrete approach at preventing improper information flow in a deterministic system. Nondeducibility [21] was a more abstract attempt at looking at possible non-secure information flow in a secure system, i.e., a covert channel. Restrictiveness [12, 13] was ostensibly developed as a nondeterministic analog of noninterference to repair purported problems involved...
A model of database components and their interconnection based upon communicating views
 in: Hannu Jakkola, Yashui Kiyoki, and Takehiro Tokuda, eds., Information Modelling and Knowledge Systems XXIV, Frontiers in Artificial Intelligence and Applications, IOS Press, 2007
"... Abstract A formalism for constructing database schemata from simple components is presented in which the components are coupled to one another via communicating views. The emphasis is upon identifying the conditions under which such components can be interconnected in a conflictfree fashion, and a ..."
Abstract

Cited by 7 (5 self)
 Add to MetaCart
A formalism for constructing database schemata from simple components is presented in which the components are coupled to one another via communicating views. The emphasis is upon identifying the conditions under which such components can be interconnected in a conflict-free fashion, and a characterization of such, based upon the acyclicity of an underlying hypergraph, is obtained. The work is furthermore oriented towards an understanding of how updates can be supported within the component-based framework, and initial ideas of so-called canonical liftings are presented.
In Some Curved Spaces, One Can Solve NP-Hard Problems in Polynomial Time
"... In the late 1970s and the early 1980s, Yuri Matiyasevich actively used his knowledge of engineering and physical phenomena to come up with parallelized schemes for solving NPhard problems in polynomial time. In this paper, we describe one such scheme in which we use parallel computation in curved s ..."
Abstract

Cited by 6 (6 self)
 Add to MetaCart
In the late 1970s and the early 1980s, Yuri Matiyasevich actively used his knowledge of engineering and physical phenomena to come up with parallelized schemes for solving NP-hard problems in polynomial time. In this paper, we describe one such scheme in which we use parallel computation in curved spaces. It is well known that many important practical problems are NP-hard; see, e.g., [11, 14, 27]. Under the usual hypothesis that P ≠ NP, NP-hardness has the following intuitive meaning: every algorithm which solves all instances of the corresponding problem requires, for
A Decomposition Theorem for Probabilistic Transition Systems
, 1995
"... In this paper we prove that every finite Markov chain can be decomposed into a cascade product of a Bernoulli process and several simple permutationreset deterministic automata. The original chain is a statehomomorphic image of the product. By doing so we give a positive answer to an open question ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
In this paper we prove that every finite Markov chain can be decomposed into a cascade product of a Bernoulli process and several simple permutation-reset deterministic automata. The original chain is a state-homomorphic image of the product. By doing so we give a positive answer to an open question stated in [Paz71] concerning the decomposability of probabilistic systems. Our result is based on the observation that in probabilistic transition systems, "randomness" and "memory" can be separated so as to allow the non-random part to be treated using common deterministic automata-theoretic techniques. The same separation technique can be applied to other kinds of nondeterminism as well. The object of our study is a probabilistic input-output state-transition system. Its definition is not new and has appeared under various names in the past (e.g., [Arb68, Paz71, Sta72]). Definition 1 (Probabilistic Transition Systems) A probabilistic transition system (PTS) is a quadr...
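The randomness/memory separation this abstract mentions can be pictured as a deterministic automaton driven by an i.i.d. coin stream; the two-state transition table below is a toy illustration, not the paper's cascade construction:

```python
import random

# Toy separation of "randomness" and "memory": a Bernoulli coin stream
# supplies all the randomness, and a deterministic automaton supplies
# all the memory. Driving this table with Bernoulli(p) coins yields a
# two-state Markov chain that flips state with probability p.
TABLE = {("a", 0): "a", ("a", 1): "b",
         ("b", 0): "b", ("b", 1): "a"}

def step(state, coin):
    """Deterministic transition on (state, coin)."""
    return TABLE[(state, coin)]

def run(n, p=0.5, seed=0):
    """Drive the automaton with n Bernoulli(p) coins; return the state path."""
    rng = random.Random(seed)
    state, path = "a", ["a"]
    for _ in range(n):
        coin = 1 if rng.random() < p else 0  # the Bernoulli process
        state = step(state, coin)            # the deterministic part
        path.append(state)
    return path

print(run(5))
```

Once the randomness is factored out this way, the deterministic component can be decomposed with standard Krohn-Rhodes-style automata techniques, which is the spirit of the paper's result.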