Results 11–20 of 1,647
Routines and other recurring action patterns of organizations: Contemporary research issues
 Industrial and Corporate Change
, 1996
Abstract

Cited by 97 (13 self)
This paper reports and extends discussions carried out during a workshop held at the Santa Fe Institute in August 1995 by the authors. It treats eight major topics: (i) the importance of carefully examining research on routine, (ii) the concept of 'action patterns' in general and in terms of routine, (iii) the useful categorization of routines and other recurring patterns, (iv) the research implications of recent cognitive results, (v) the relation of evolution to action patterns, (vi) the contributions of simulation modeling for theory in this area, (vii) examples of various approaches to empirical research that reveal key problems, and (viii) a possible definition of 'routine'. An extended appendix by Massimo Egidi provides a lexicon of synonyms and opposites covering use of the word 'routine' in such areas as economics, organization theory and artificial intelligence.
Scalable Computing
 Computer Science Today: Recent Trends and Developments
, 1996
Abstract

Cited by 91 (3 self)
Scalable computing will, over the next few years, become the normal form of computing. In this paper we present a unified framework, based on the BSP model, which aims to serve as a foundation for this evolutionary development. A number of important techniques, tools and methodologies for the design of sequential algorithms and programs have been developed over the past few decades. In the transition from sequential to scalable computing we will find that new requirements such as universality and predictable performance will necessitate significant changes of emphasis in these areas. Programs for scalable computing, in addition to being fully portable, will have to be efficiently universal, offering high performance, in a predictable way, on any general purpose parallel architecture. The BSP model provides a discipline for the design of scalable programs of this kind. We outline the approach and discuss some of the issues involved.
1 Introduction
For fifty years, sequential computin...
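The superstep discipline this abstract refers to can be sketched briefly. The following is a hedged, sequential simulation of the BSP pattern (local computation, global message exchange, barrier synchronization); the function names and the reduction example are invented for illustration, not taken from the paper:

```python
# Minimal sketch of the BSP execution model, simulated sequentially:
# p processes alternate local computation and a global message exchange
# separated by a barrier. All names here are illustrative.

def bsp_run(p, supersteps, init_state):
    """Run a list of BSP supersteps over p simulated processors."""
    states = [dict(init_state, pid=i) for i in range(p)]
    inbox = [[] for _ in range(p)]
    for step in supersteps:
        outboxes = [[] for _ in range(p)]
        # Local computation phase: each processor sees only its own
        # state and the messages delivered at the previous barrier.
        for i in range(p):
            step(states[i], inbox[i], outboxes[i].append)
        # Communication phase + barrier: messages become visible only
        # in the next superstep, as the BSP model requires.
        inbox = [[] for _ in range(p)]
        for i in range(p):
            for dest, msg in outboxes[i]:
                inbox[dest].append(msg)
    return states

# Example: parallel sum by an all-to-one reduction in two supersteps.
def scatter(state, msgs, send):
    send((0, state["pid"] + 1))        # send my value to processor 0

def reduce_(state, msgs, send):
    if state["pid"] == 0:
        state["total"] = sum(msgs)     # processor 0 combines all values

result = bsp_run(4, [scatter, reduce_], {"total": 0})
# processor 0 now holds 1 + 2 + 3 + 4
```

The strict separation of computation and communication is what makes the cost of each superstep predictable, which is the property the abstract emphasizes.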
A DNA and restriction enzyme implementation of Turing Machines.
 DIMACS SERIES IN DISCRETE MATHEMATICS AND THEORETICAL COMPUTER SCIENCE
Abstract

Cited by 86 (1 self)
Bacteria employ restriction enzymes to cut or restrict DNA at or near specific words in a unique way. Many restriction enzymes cut the two strands of double-stranded DNA at different positions, leaving overhangs of single-stranded DNA. Two pieces of DNA may be rejoined or ligated if their terminal overhangs are complementary. Using these operations, fragments of DNA, or oligonucleotides, may be inserted into and deleted from a circular piece of plasmid DNA. We propose an encoding for the transition table of a Turing machine in DNA oligonucleotides and a corresponding series of restrictions and ligations of those oligonucleotides that, when performed on circular DNA encoding an instantaneous description of a Turing machine, simulate the operation of the Turing machine encoded in those oligonucleotides. DNA-based Turing machines have been proposed by Charles Bennett, but they invoke imaginary enzymes to perform the state-symbol transitions. Our approach differs in that every operation can be pe...
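The two operations the abstract relies on, a staggered restriction cut that leaves sticky ends and a ligation of fragments with matching overhangs, can be illustrated with a toy string model. This is not the paper's Turing-machine encoding; the sequence and cut geometry below are hypothetical (EcoRI-like), and complementarity of overhangs is modeled as identity in a top-strand-only encoding:

```python
# Toy model of restriction and ligation on a top-strand string encoding.
# A staggered cut splits the recognition site so both fragments carry
# the same single-stranded overhang; ligation rejoins matching ends.

def cut(seq, site, offset):
    """Cut seq at the recognition site, leaving an overhang of
    len(site) - 2*offset bases. Returns (left, right) fragments,
    each a (body, overhang) or (overhang, body) pair."""
    i = seq.index(site)
    overhang = seq[i + offset : i + len(site) - offset]
    left = (seq[: i + offset], overhang)               # overhang at 3' end
    right = (overhang, seq[i + len(site) - offset :])  # overhang at 5' end
    return left, right

def ligate(left, right):
    """Rejoin fragments whose terminal overhangs match (complementary
    as double strands, hence identical in this top-strand encoding)."""
    (body_l, over_l), (over_r, body_r) = left, right
    if over_l != over_r:
        raise ValueError("overhangs are not complementary")
    return body_l + over_l + body_r

# Hypothetical EcoRI-like site GAATTC, cut after the G: AATT overhang.
l, r = cut("TTGAATTCAA", "GAATTC", 1)
assert ligate(l, r) == "TTGAATTCAA"  # restriction then ligation is inverse
```

Insertion and deletion on plasmid DNA, as described in the abstract, amount to cutting, swapping in a fragment with matching overhangs, and ligating.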
Computability theory
, 2004
Abstract

Cited by 83 (6 self)
Nature was computing long before humans started. It is the algorithmic content of the universe that makes it an environment we can survive in. On the other hand, computation has been basic to civilisation from the earliest times. But computability? Computability theory is computation with consciousness, and entails the huge step from doing computation to observing and analysing the activity, and understanding something about what we can and cannot compute. And then — using the knowledge acquired as a stepping stone to a better understanding of the world we live in, and to new and previously unexpected computational strategies. It is relatively recently that computability graduated from being an essential element of our daily lives to being a concept one could talk about with precision. Computability as a theory originated with the work of Gödel, Turing, Church and others in the 1930s. The idea that reasoning might be essentially algorithmic goes back to Gottfried Leibniz — as he says in The Art of Discovery (1685), [24, p.51]:
Minimum Description Length Induction, Bayesianism, and Kolmogorov Complexity
 IEEE Transactions on Information Theory
, 1998
Abstract

Cited by 82 (8 self)
The relationship between the Bayesian approach and the minimum description length approach is established. We sharpen and clarify the general modeling principles MDL and MML, abstracted as the ideal MDL principle and defined from Bayes's rule by means of Kolmogorov complexity. The basic condition under which the ideal principle should be applied is encapsulated as the Fundamental Inequality, which in broad terms states that the principle is valid when the data are random, relative to every contemplated hypothesis and also these hypotheses are random relative to the (universal) prior. Basically, the ideal principle states that the prior probability associated with the hypothesis should be given by the algorithmic universal probability, and the sum of the log universal probability of the model plus the log of the probability of the data given the model should be minimized. If we restrict the model class to the finite sets then application of the ideal principle turns into Kolmogorov's mi...
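The two-part selection rule the abstract describes, minimizing the codelength of the hypothesis plus the codelength of the data given the hypothesis, can be sketched concretely. Note that the paper's ideal principle uses the algorithmic universal probability as the prior, which is uncomputable; the uniform prior and Bernoulli hypothesis class below are stand-ins chosen purely for illustration:

```python
# Hedged sketch of two-part MDL selection: choose the hypothesis H
# minimizing -log2 P(H) - log2 P(D | H), i.e. model codelength plus
# data-given-model codelength. The hypothesis class and prior are
# illustrative, not the paper's universal prior.

import math

def mdl_select(data, hypotheses, prior, likelihood):
    """Return the hypothesis with minimal total description length."""
    def total_length(h):
        return -math.log2(prior(h)) - math.log2(likelihood(data, h))
    return min(hypotheses, key=total_length)

# Example: pick a Bernoulli bias for a record of coin flips.
def bern_likelihood(data, p):
    k = sum(data)                      # number of heads
    return (p ** k) * ((1 - p) ** (len(data) - k))

flips = [1, 1, 1, 0, 1, 1, 0, 1]       # 6 heads, 2 tails
best = mdl_select(flips, [0.25, 0.5, 0.75],
                  prior=lambda h: 1 / 3,
                  likelihood=bern_likelihood)
# best == 0.75: the bias giving the shortest two-part code
```

With a uniform prior the model term is constant, so here MDL reduces to maximum likelihood; the abstract's Fundamental Inequality spells out when this coding view and the Bayesian view coincide in general.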
Evolving Algebras: An Attempt To Discover Semantics
, 1993
Abstract

Cited by 81 (13 self)
Machine (a virtual machine model which underlies most of the current Prolog implementations and incorporates crucial optimization techniques) starting from a more abstract EA for Prolog developed by Börger in [Bo1–Bo3].
Q: How do you tailor an EA machine to the abstraction level of an algorithm whose individual steps are complicated algorithms all by themselves? For example, the algorithm may be written in a high level language that allows, say, multiplying integer matrices in one step.
A: You model the given algorithm modulo those algorithms needed to perform single steps. In your case, matrix multiplication will be built in as an operation.
Q: Coming back to Turing, there could be a good reason for him to speak about computable functions rather than algorithms. We don't really know what algorithms are.
A: I agree. Notice, however, that there are different notions of algorithm. On the one hand, an algorithm is an intuitive idea which you have in your head before writing code. Th...
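The answer about modeling "modulo" single-step subalgorithms can be sketched informally. This is not Gurevich's evolving-algebra formalism itself, only a toy state-transition machine in its spirit, where matrix multiplication is treated as one built-in atomic operation; all names are hypothetical:

```python
# Illustrative sketch of an evolving-algebra-style step function whose
# abstraction level treats matrix multiplication as a single built-in
# operation rather than modeling it step by step.

def matmul(a, b):
    """Built-in operation: the machine fires this as one atomic step."""
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

def step(state):
    """One transition: apply every rule whose guard holds, at once."""
    updates = {}
    if state["pc"] == 0:               # rule: M := A * B; pc := 1
        updates["M"] = matmul(state["A"], state["B"])
        updates["pc"] = 1
    return {**state, **updates}

s = {"pc": 0, "A": [[1, 2], [3, 4]], "B": [[5, 6], [7, 8]]}
s = step(s)
# s["M"] == [[19, 22], [43, 50]] after a single abstract step
```

The point of the Q&A is exactly this choice of granularity: the same algorithm could instead be modeled with `matmul` expanded into its own sequence of transitions.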
Special Purpose Parallel Computing
 Lectures on Parallel Computation
, 1993
Abstract

Cited by 81 (6 self)
A vast amount of work has been done in recent years on the design, analysis, implementation and verification of special purpose parallel computing systems. This paper presents a survey of various aspects of this work. A long, but by no means complete, bibliography is given.
1. Introduction
Turing [365] demonstrated that, in principle, a single general purpose sequential machine could be designed which would be capable of efficiently performing any computation which could be performed by a special purpose sequential machine. The importance of this universality result for subsequent practical developments in computing cannot be overstated. It showed that, for a given computational problem, the additional efficiency advantages which could be gained by designing a special purpose sequential machine for that problem would not be great. Around 1944, von Neumann produced a proposal [66, 389] for a general purpose stored-program sequential computer which captured the fundamental principles of...
What is the philosophy of information?
 Metaphilosophy
, 2002
Abstract

Cited by 79 (5 self)
André Gide once wrote that one does not discover new lands without consenting to
On the Decision Problem for Two-Variable First-Order Logic
, 1997
Abstract

Cited by 78 (1 self)
We identify the computational complexity of the satisfiability problem for FO², the fragment of first-order logic consisting of all relational first-order sentences with at most two distinct variables. Although this fragment was shown to be decidable a long time ago, the computational complexity of its decision problem has not been pinpointed so far. In 1975 Mortimer proved that FO² has the finite-model property, which means that if an FO² sentence is satisfiable, then it has a finite model. Moreover, Mortimer showed that every satisfiable FO² sentence has a model whose size is at most doubly exponential in the size of the sentence. In this paper, we improve Mortimer's bound by one exponential and show that every satisfiable FO² sentence has a model whose size is at most exponential in the size of the sentence. As a consequence, we establish that the satisfiability problem for FO² is NEXPTIME-complete.
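The finite-model property the abstract improves has a direct (if inefficient) algorithmic reading: since a satisfiable FO² sentence has a model of bounded size, one can decide satisfiability by searching finite structures up to the bound. The sketch below is not the paper's NEXPTIME procedure, only a brute-force search for one hypothetical FO² sentence with a single binary relation R:

```python
# Brute-force model search illustrating the finite-model property.
# Sentence checked (hypothetical example, one binary relation R):
#   forall x exists y. R(x, y)   and   forall x. not R(x, x)

from itertools import product

def satisfiable_up_to(check, max_size):
    """Search for a model (domain size n, relation R) with n <= max_size."""
    for n in range(1, max_size + 1):
        dom = range(n)
        pairs = list(product(dom, dom))
        # Enumerate every binary relation on the n-element domain.
        for bits in product([False, True], repeat=len(pairs)):
            R = {p for p, b in zip(pairs, bits) if b}
            if check(dom, R):
                return n, R
    return None

def phi(dom, R):
    """forall x exists y. R(x,y), and R is irreflexive."""
    return (all(any((x, y) in R for y in dom) for x in dom)
            and all((x, x) not in R for x in dom))

model = satisfiable_up_to(phi, 3)
# smallest model has two elements, with R = {(0, 1), (1, 0)}
```

The exponential model-size bound proved in the paper is what makes such a bounded search correct; the doubly exponential number of candidate relations is what makes it naive.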