Results 1–10 of 83
The induction of dynamical recognizers
Machine Learning, 1991
Cited by 210 (14 self)
Abstract:
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: Induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network’s capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-Hard problem, the architecture does appear capable of generating nonregular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
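The symbol-indexed state update of such a "higher order" (second-order) recurrent recognizer can be sketched as below. The dimensions are made up and the weights are untrained random values, so this only shows the shape of the architecture, not Pollack's trained networks:

```python
import numpy as np

rng = np.random.default_rng(0)

# Second-order recurrent recognizer: each input symbol selects its own
# state-transition matrix, so the update is bilinear in (state, input).
# Pollack trains these weights on categorized exemplars; here they are
# random, purely for illustration.
n_states, alphabet = 4, {"0": 0, "1": 1}
W = rng.normal(scale=2.0, size=(len(alphabet), n_states, n_states))
b = rng.normal(size=(len(alphabet), n_states))
h0 = np.zeros(n_states)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def run(string):
    h = h0
    for ch in string:
        k = alphabet[ch]
        h = sigmoid(W[k] @ h + b[k])   # symbol-indexed (second-order) update
    return h

def accepts(string, unit=0, threshold=0.5):
    # Read acceptance off one designated state unit.
    return run(string)[unit] > threshold

for s in ["", "01", "0011", "0101"]:
    print(s or "<eps>", accepts(s))
```

Viewing the per-symbol maps as an iterated function system is what connects this architecture to the fractal and chaotic dynamics mentioned in the abstract.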
Models of Computation: Exploring the Power of Computing
Cited by 56 (6 self)
Abstract:
Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s programming languages, language translators, and operating systems were under development and therefore became both the subject and basis for a great deal of theoretical work. The power of computers of this period was limited by slow processors and small amounts of memory, and thus theories (models, algorithms, and analysis) were developed to explore the efficient use of computers as well as the inherent complexity of problems. The former subject is known today as algorithms and data structures, the latter computational complexity. The focus of theoretical computer scientists in the 1960s on languages is reflected in the first textbook on the subject, Formal Languages and Their Relation to Automata by John Hopcroft and Jeffrey Ullman. This influential book led to the creation of many language-centered theoretical computer science courses; many introductory theory courses today continue to reflect the content of this book and the interests of theoreticians of the 1960s and early 1970s. Although ...
Evolution, Neural Networks, Games, and Intelligence
Cited by 54 (1 self)
Abstract:
Intelligence pertains to the ability to make appropriate decisions in light of specific goals and to adapt behavior to meet those goals in a range of environments. Mathematical games provide a framework for studying intelligent behavior in models of real-world settings or restricted domains. The behavior of alternative strategies in these games is defined by each individual’s stimulus-response mapping. Limiting these behaviors to linear functions of the environmental conditions renders the results little more than a façade: Effective decision making in any complex environment almost always requires nonlinear stimulus-response mappings. The obstacle then comes in choosing the appropriate representation and learning algorithm. Neural networks and evolutionary algorithms provide useful means for addressing these issues. This paper describes efforts to hybridize neural and evolutionary computation to learn appropriate strategies in zero- and nonzero-sum games, including the iterated prisoner’s dilemma, tic-tac-toe, and checkers. With respect to checkers, the evolutionary algorithm was able to discover a neural network that can be used to play at a near-expert level without injecting expert knowledge about how to play the game. The implications of evolutionary learning with respect to machine intelligence are also discussed. It is argued that evolution provides the framework for explaining naturally occurring intelligent entities and can be used to design machines that are also capable of intelligent behavior.
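A toy version of the neural/evolutionary hybrid described above (not the paper's checkers experiments) can be sketched as a (1+λ) evolution strategy over the weights of a tiny neural strategy for the iterated prisoner's dilemma, scored against a fixed tit-for-tat opponent. The network sizes, mutation scale, and opponent choice are all illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

# Standard one-shot prisoner's dilemma payoffs for the row player.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def act(w, my_last, opp_last):
    # Nonlinear stimulus-response mapping: a tiny tanh network whose
    # inputs are the previous moves of both players.
    x = np.array([1.0, my_last == "D", opp_last == "D"], dtype=float)
    h = np.tanh(w[:6].reshape(2, 3) @ x)
    return "D" if np.tanh(w[6:8] @ h + w[8]) > 0 else "C"

def fitness(w, rounds=50):
    # Total payoff against tit-for-tat, which repeats our previous move.
    my, opp, total = "C", "C", 0
    for _ in range(rounds):
        me = act(w, my, opp)
        total += PAYOFF[(me, opp)]
        my = opp = me   # tit-for-tat copies our last move
    return total

# (1+10) evolution strategy: keep the best of parent and mutated offspring.
best = rng.normal(size=9)
for gen in range(200):
    kids = best + rng.normal(scale=0.3, size=(10, 9))
    best = max(list(kids) + [best], key=fitness)
print("best payoff:", fitness(best))
```

Against tit-for-tat the selection pressure favors sustained cooperation, so the evolved payoff should approach the all-cooperate total of 150 over 50 rounds.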
A Formal Model and Specification Language for Procedure Calling Conventions
In Conference Record of the 22nd Annual ACM Symposium on Principles of Programming Languages, 1994
Cited by 30 (9 self)
Abstract:
Procedure calling conventions are used to provide uniform procedure call interfaces. Applications such as compilers and debuggers, which generate or process procedures at the machine-language abstraction level, require knowledge of the calling convention. In this paper, we develop a formal model for procedure calling conventions called PFSA's. Using this model, we are able to ensure several completeness and consistency properties of calling conventions. Currently, applications that manipulate procedures implement conventions in an ad hoc manner. The resulting code is complicated with details, difficult to maintain, and often riddled with errors. To alleviate the situation, we introduce a calling convention specification language, called CCL. The combination of CCL and PFSA's facilitates the accurate specification of conventions that can be shown to be both consistent and complete.
1 Introduction
Procedures, or functions, in programming languages work in concert to implement the in...
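The general idea of treating a convention as an automaton that consumes parameter types and emits locations can be sketched as below. The register names and the "first arguments in registers, the rest on the stack" rule are invented for illustration; this is not the paper's PFSA model or CCL syntax:

```python
# Hypothetical convention: the first few integer arguments go in registers
# r0..r3, everything else (including all doubles here) goes on the stack.
# The dict `state` plays the role of the automaton's state as it scans the
# parameter list left to right.
def assign_locations(param_types, int_regs=("r0", "r1", "r2", "r3")):
    state = {"next_reg": 0, "stack_off": 0}
    locs = []
    for t in param_types:
        if t == "int" and state["next_reg"] < len(int_regs):
            locs.append(int_regs[state["next_reg"]])
            state["next_reg"] += 1
        else:
            # Spill: overflow integers and non-integer types go on the stack.
            locs.append(f"stack+{state['stack_off']}")
            state["stack_off"] += 8
    return locs

print(assign_locations(["int", "double", "int", "int", "int", "int"]))
# → ['r0', 'stack+0', 'r1', 'r2', 'r3', 'stack+8']
```

Making such state explicit is what lets a formal model check completeness (every parameter list gets locations) and consistency (caller and callee agree).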
A Taxonomy for Spatiotemporal Connectionist Networks Revisited: The Unsupervised Case
Neural Computation, 2003
Cited by 21 (1 self)
Abstract:
Spatiotemporal connectionist networks (STCN's) comprise an important class of neural models that can deal with patterns distributed both in time and space. In this paper, we widen the application domain of the taxonomy for supervised STCN's recently proposed by Kremer (2001) to the unsupervised case. This is possible through a reinterpretation of the state vector as a vector of latent (hidden) variables, as proposed by Meinicke (2000). The goal of this generalized taxonomy is then to provide a nonlinear generative framework for describing unsupervised spatiotemporal networks, making it easier to compare and contrast their representational and operational characteristics. Computational properties, representational issues and learning are also discussed and a number of references to the relevant source publications are provided. It is argued that the proposed approach is simple and more powerful than the previous attempts, from a descriptive and predictive viewpoint. We also discuss the relation of this taxonomy with automata theory and state space modeling, and suggest directions for further work.
TLA in Pictures, 1995
Cited by 19 (0 self)
Abstract:
Predicate-action diagrams, which are similar to standard state-transition diagrams, are precisely defined as formulas of TLA (the Temporal Logic of Actions). We explain how these diagrams can be used to describe aspects of a specification, and those descriptions then proved correct even when the complete specification cannot be written as a diagram. We also use the diagrams to illustrate proofs.
Keywords: Concurrency, specification, state-transition diagrams, temporal logic.
I. Introduction
Pictures aid understanding. A simple flowchart is easier to understand than the equivalent programming-language text. However, complex pictures are confusing. A large, spaghetti-like flowchart is harder to understand than a properly structured program text. Pictures are inadequate for specifying complex systems, but they can help us understand particular aspects of a system. For a picture to provide more than an informal comment, there must be a formal connection between the complete specific...
Weighted and probabilistic context-free grammars are equally expressive
Computational Linguistics, 2007
Cited by 14 (2 self)
Abstract:
This paper studies the relationship between weighted context-free grammars (WCFGs), where each production is associated with a positive real-valued weight, and probabilistic context-free grammars (PCFGs), where the weights of the productions associated with a nonterminal are constrained to sum to one. Since the class of WCFGs properly includes the PCFGs, one might expect that WCFGs can describe distributions that PCFGs cannot. However, Chi (1999) and Abney, McAllester, and Pereira (1999) proved that every WCFG distribution is equivalent to some PCFG distribution. We extend their results to conditional distributions, and show that every WCFG conditional distribution of parses given strings is also the conditional distribution defined by some PCFG, even when the WCFG’s partition function diverges. This shows that any parsing or labeling accuracy improvement from conditional estimation of WCFGs or CRFs over joint estimation of PCFGs or HMMs is due to the estimation procedure rather than the change in model class, since PCFGs and HMMs are exactly as expressive as WCFGs and chain-structured CRFs respectively.
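The Chi (1999) construction cited above can be illustrated on a one-nonterminal toy grammar: each rule's probability is its weight, scaled by the partition functions of the right-hand-side nonterminals, divided by the partition function of the left-hand side. The grammar and its weights below are made up; only the construction is the cited one:

```python
# Toy WCFG: S -> S S (weight w), S -> a (weight v). The partition
# function Z(S), the total weight of all derivations from S, satisfies
# Z = w*Z**2 + v; we find its least fixed point by iteration, then set
#   p(A -> beta) = w(A -> beta) * prod(Z(B) for B in beta) / Z(A).
w, v = 0.25, 0.5   # illustrative weights chosen so Z converges

Z = 0.0
for _ in range(200):
    Z = w * Z * Z + v   # fixed-point iteration toward Z(S)

p_branch = w * Z * Z / Z   # p(S -> S S)
p_leaf = v / Z             # p(S -> a)
print(round(p_branch, 4), round(p_leaf, 4), round(p_branch + p_leaf, 4))
```

The renormalized probabilities sum to one by construction, and each parse tree keeps the same relative weight, which is the sense in which the WCFG and PCFG define the same distribution.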
Procedural Analysis of Choice Rules with Applications to Bounded Rationality
American Economic Review, 2011
Cited by 10 (1 self)
Abstract:
I study how limited abilities to process information affect choice behavior. I model the decision-making process by an automaton, and measure the complexity of a specific choice rule by the minimal number of states an automaton implementing the rule uses to process information. I establish that any choice rule that is less complicated than utility maximization displays framing effects. I then prove that choice rules that result from an optimal tradeoff between maximizing utility and minimizing complexity are history-dependent satisficing procedures that display primacy and recency effects. (JEL D01, D03, D11, D83) In 1981, an internal study by American Airlines found that travel agents booked the first flight that appeared on their computer screen 53 percent of the time, and a flight that appeared somewhere on the first screen almost 92 percent of the time. This provided an incentive for American to manipulate the order in which flights were listed on American’s “Sabre” computer reservation system, which dominated the market for electronic flight booking. Allegations by smaller airlines that American—and United Airlines, which owned the second largest computer reservation system “Apollo”—engaged in such manipulations led the US federal government to regulate screen display in computer reservation systems. This example, in which the order of the alternatives affects decision making, is by no means unique. In elections, for example, being listed first on the ballot increases the likelihood of winning office, mainly at the expense of candidates listed in the middle of the ballot. In competitions, judges tend to award higher ...
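The satisficing idea can be sketched as a two-state automaton that scans alternatives in presentation order and stops at the first one clearing a threshold, so it needs far fewer states than full utility maximization. The flight names, utilities, and threshold below are invented, echoing the Sabre example only loosely:

```python
# Two-state satisficing automaton: "still searching" vs. "found, stop".
def satisfice(alternatives, utility, threshold):
    for a in alternatives:          # state 1: keep scanning
        if utility(a) >= threshold:
            return a                # state 2: first satisfactory item wins
    return alternatives[-1]         # default if nothing satisfices

u = {"flight_A": 6, "flight_B": 9, "flight_C": 7}.get

# The same choice set under two presentation orders: an order (framing)
# effect that utility maximization would never exhibit.
print(satisfice(["flight_A", "flight_B", "flight_C"], u, 5))  # flight_A
print(satisfice(["flight_B", "flight_A", "flight_C"], u, 5))  # flight_B
```

Reordering the list changes the choice even though the utilities are unchanged, which is the kind of framing effect the paper ties to low-complexity rules.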
An agent design method promoting separation between computation and coordination
Proceedings of the 2004 ACM Symposium on Applied Computing, 2004
Cited by 9 (6 self)
Abstract:
The development of (internet) agents is often a tedious and error-prone task resulting in poorly reusable designs, since both the internal computation of the agent as well as the coordination support are developed in an ad hoc fashion. To improve the process of agent-oriented software development, we propose an agent design method that imposes the separation of internal computation from coordination aspects. This method comprises two dimensions: a design formalism and an agent design process. As an illustration of the presented method, we present the design of an internet agent that is entitled to deploy a distributed service in a computer network, without breaking the consistency of that network. The presented design method has resulted in the development of ACF (Agent Composition Framework), a component framework to build flexible internet agents. We argue that the presented design method combined with this infrastructure can promote a modular and easy-to-manage approach to the design and development of internet agent applications.
Testing reactive systems with GAST
In S. Gilmore, Trends in Functional Programming 4, 2003
Cited by 9 (6 self)
Abstract:
GAST is a fully automatic test system. Given a logical property, stated as a function, it is able to generate appropriate test values, to execute the tests, and to evaluate the results of these tests.
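GAST itself is embedded in the functional language Clean; the sketch below is a loose Python analogue of the workflow the abstract describes — a property stated as a function, automatic generation of test values, execution, and evaluation — and not GAST's actual API:

```python
import random

# Minimal property-based test driver: generate n values, run the property
# on each, report the first counterexample or success.
def check(prop, gen, n=100, seed=0):
    rng = random.Random(seed)
    for i in range(n):
        x = gen(rng)
        if not prop(x):
            return f"falsified after {i + 1} tests: {x!r}"
    return f"passed {n} tests"

# Property: reversing a list twice gives the list back.
prop_rev = lambda xs: list(reversed(list(reversed(xs)))) == xs
gen_list = lambda rng: [rng.randint(-100, 100) for _ in range(rng.randint(0, 10))]

print(check(prop_rev, gen_list))   # passed 100 tests
# A deliberately false property -- reversal is the identity -- which random
# lists quickly refute.
print(check(lambda xs: list(reversed(xs)) == xs, gen_list))
```

Real systems like GAST add systematic (rather than purely random) enumeration of test values and shrinking of counterexamples on top of this basic loop.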