Results 1–10 of 135
The induction of dynamical recognizers
 Machine Learning, 1991
"... A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained " on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the le ..."
Abstract

Cited by 225 (14 self)
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: Induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network’s capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-hard problem, the architecture does appear capable of generating nonregular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
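The "higher order" update the abstract alludes to multiplies the current state by the current input symbol before squashing. A minimal sketch of that dynamics (the dimensions, one-hot encoding, and random untrained weights are illustrative choices, not Pollack's actual setup):

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def step(state, symbol, W):
    """Higher-order update: each new state unit sums weighted
    products of old-state units with the one-hot input symbol."""
    x = [1.0, 0.0] if symbol == "0" else [0.0, 1.0]
    return [
        sigmoid(sum(W[k][i][j] * state[i] * x[j]
                    for i in range(len(state)) for j in range(2)))
        for k in range(len(state))
    ]

def accepts(string, W, n=2):
    state = [0.5] * n                 # fixed start state
    for ch in string:
        state = step(state, ch, W)
    return state[0] > 0.5             # unit 0 read out as accept/reject

random.seed(0)
W = [[[random.uniform(-2, 2) for _ in range(2)]
      for _ in range(2)] for _ in range(2)]
print(accepts("0101", W))
```

Training would adjust W from categorized exemplars; the "induction by phase transition" in the abstract refers to a small change in such weights abruptly changing the network's limit behavior.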
Models of Computation  Exploring the Power of Computing
"... Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s programming languages, language translators, and oper ..."
Abstract

Cited by 87 (6 self)
Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s programming languages, language translators, and operating systems were under development and therefore became both the subject and basis for a great deal of theoretical work. The power of computers of this period was limited by slow processors and small amounts of memory, and thus theories (models, algorithms, and analysis) were developed to explore the efficient use of computers as well as the inherent complexity of problems. The former subject is known today as algorithms and data structures, the latter computational complexity. The focus of theoretical computer scientists in the 1960s on languages is reflected in the first textbook on the subject, Formal Languages and Their Relation to Automata by John Hopcroft and Jeffrey Ullman. This influential book led to the creation of many language-centered theoretical computer science courses; many introductory theory courses today continue to reflect the content of this book and the interests of theoreticians of the 1960s and early 1970s. Although
Evolution, Neural Networks, Games, and Intelligence
"... Intelligence pertains to the ability to make appropriate decisions in light of specific goals and to adapt behavior to meet those goals in a range of environments. Mathematical games provide a framework for studying intelligent behavior in models of realworld settings or restricted domains. The beh ..."
Abstract

Cited by 65 (1 self)
Intelligence pertains to the ability to make appropriate decisions in light of specific goals and to adapt behavior to meet those goals in a range of environments. Mathematical games provide a framework for studying intelligent behavior in models of real-world settings or restricted domains. The behavior of alternative strategies in these games is defined by each individual’s stimulus-response mapping. Limiting these behaviors to linear functions of the environmental conditions renders the results little more than a façade: Effective decision making in any complex environment almost always requires nonlinear stimulus-response mappings. The obstacle then comes in choosing the appropriate representation and learning algorithm. Neural networks and evolutionary algorithms provide useful means for addressing these issues. This paper describes efforts to hybridize neural and evolutionary computation to learn appropriate strategies in zero- and nonzero-sum games, including the iterated prisoner’s dilemma, tic-tac-toe, and checkers. With respect to checkers, the evolutionary algorithm was able to discover a neural network that can be used to play at a near-expert level without injecting expert knowledge about how to play the game. The implications of evolutionary learning with respect to machine intelligence are also discussed. It is argued that evolution provides the framework for explaining naturally occurring intelligent entities and can be used to design machines that are also capable of intelligent behavior.
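A stripped-down version of the neuroevolutionary recipe described above, using a simple (1+1) hill-climbing evolutionary strategy with XOR standing in for a game's nonlinear stimulus-response mapping (the network shape, mutation scale, and task are illustrative, not the paper's checkers experiment):

```python
import math
import random

def net(w, x):
    """Tiny 2-2-1 feedforward network; w is a flat list of 9 weights."""
    h1 = math.tanh(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = math.tanh(w[3] * x[0] + w[4] * x[1] + w[5])
    return math.tanh(w[6] * h1 + w[7] * h2 + w[8])

# XOR as the stimulus-response task, with +/-1 targets:
# no linear mapping can solve it, echoing the abstract's point.
CASES = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]

def fitness(w):
    return -sum((net(w, x) - t) ** 2 for x, t in CASES)

random.seed(1)
best = [random.uniform(-1, 1) for _ in range(9)]
for _ in range(5000):
    # Mutate every weight with Gaussian noise; keep the child if better.
    child = [wi + random.gauss(0, 0.3) for wi in best]
    if fitness(child) > fitness(best):
        best = child
print(fitness(best))
```

The paper's actual system evolves much larger networks evaluated by game play rather than a fixed error function, but the mutate-and-select loop is the same shape.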
A Formal Model and Specification Language for Procedure Calling Conventions
 In Conference Record of the 22nd Annual ACM Symposium on Principles of Programming Languages, 1994
"... Procedure calling conventions are used to provide uniform procedure call interfaces. Applications, such as compilers and debuggers, which generate, or process procedures at the machinelanguage abstraction level require knowledge of the calling convention. In this paper, we develop a formal model fo ..."
Abstract

Cited by 35 (9 self)
Procedure calling conventions are used to provide uniform procedure call interfaces. Applications, such as compilers and debuggers, which generate or process procedures at the machine-language abstraction level require knowledge of the calling convention. In this paper, we develop a formal model for procedure calling conventions called PFSA's. Using this model, we are able to ensure several completeness and consistency properties of calling conventions. Currently, applications that manipulate procedures implement conventions in an ad hoc manner. The resulting code is cluttered with details, difficult to maintain, and often riddled with errors. To alleviate the situation, we introduce a calling convention specification language, called CCL. The combination of CCL and PFSA's facilitates the accurate specification of conventions that can be shown to be both consistent and complete. 1 Introduction Procedures, or functions, in programming languages work in concert to implement the in...
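To make "consistency" concrete: a calling convention is, at minimum, a map from arguments to disjoint locations. A toy convention sketched below (the register names and slot sizes are invented for illustration; the paper's PFSA/CCL machinery is far more general):

```python
# Toy convention: the first four integer arguments go to registers
# R0..R3, the rest to 8-byte stack slots. (Hypothetical names, not
# any real ABI.)
ARG_REGS = ["R0", "R1", "R2", "R3"]

def locate_args(n_args):
    """Map each argument index to a location under the toy convention."""
    locs = []
    for i in range(n_args):
        if i < len(ARG_REGS):
            locs.append(ARG_REGS[i])
        else:
            locs.append(f"stack+{8 * (i - len(ARG_REGS))}")
    return locs

def is_consistent(locs):
    """Consistency in the paper's sense: no two arguments share a location."""
    return len(set(locs)) == len(locs)

locs = locate_args(6)
print(locs)  # ['R0', 'R1', 'R2', 'R3', 'stack+0', 'stack+8']
assert is_consistent(locs)
```

A specification language like CCL exists precisely so that properties like `is_consistent` can be checked once, against the specification, instead of being re-implemented ad hoc in every compiler and debugger.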
A Taxonomy for Spatiotemporal Connectionist Networks Revisited: The Unsupervised Case
 Neural Computation, 2003
"... Spatiotemporal connectionist networks (STCN's) comprise an important class of neural models that can deal with patterns distributed both in time and space. In this paper, we widen the application domain of the taxonomy for supervised STCN's recently proposed by Kremer (2001) to the unsuper ..."
Abstract

Cited by 27 (1 self)
Spatiotemporal connectionist networks (STCN's) comprise an important class of neural models that can deal with patterns distributed both in time and space. In this paper, we widen the application domain of the taxonomy for supervised STCN's recently proposed by Kremer (2001) to the unsupervised case. This is possible through a reinterpretation of the state vector as a vector of latent (hidden) variables, as proposed by Meinicke (2000). The goal of this generalized taxonomy is then to provide a nonlinear generative framework for describing unsupervised spatiotemporal networks, making it easier to compare and contrast their representational and operational characteristics. Computational properties, representational issues, and learning are also discussed, and a number of references to the relevant source publications are provided. It is argued that the proposed approach is simpler and more powerful than previous attempts, from both a descriptive and a predictive viewpoint. We also discuss the relation of this taxonomy to automata theory and state space modeling, and suggest directions for further work.
Procedural Analysis of Choice Rules with Applications to Bounded Rationality
 American Economic Review, 2011
"... I study how limited abilities to process information affect choice behavior. I model the decisionmaking process by an automaton, and measure the complexity of a specific choice rule by the minimal number of states an automaton implementing the rule uses to process information. I establish that any ..."
Abstract

Cited by 22 (2 self)
I study how limited abilities to process information affect choice behavior. I model the decision-making process by an automaton, and measure the complexity of a specific choice rule by the minimal number of states an automaton implementing the rule uses to process information. I establish that any choice rule that is less complicated than utility maximization displays framing effects. I then prove that choice rules that result from an optimal tradeoff between maximizing utility and minimizing complexity are history-dependent satisficing procedures that display primacy and recency effects. (JEL D01, D03, D11, D83) In 1981, an internal study by American Airlines found that travel agents booked the first flight that appeared on their computer screen 53 percent of the time, and a flight that appeared somewhere on the first screen almost 92 percent of the time. This provided an incentive for American to manipulate the order in which flights were listed on American’s “Sabre” computer reservation system, which dominated the market for electronic flight booking. Allegations by smaller airlines that American—and United Airlines, which owned the second largest computer reservation system “Apollo”—engaged in such manipulations led the US federal government to regulate screen display in computer reservation systems. 1 This example, in which the order of the alternatives affects decision making, is by no means unique. In elections, for example, being listed first on the ballot increases the likelihood of winning office, mainly at the expense of candidates listed in the middle of the ballot. 2 In competitions, judges tend to award higher...
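The satisficing procedures the abstract characterizes are easy to state as code, and doing so makes the framing effect visible: the chosen alternative depends on presentation order, not just on utilities (the names and numbers below are illustrative):

```python
def satisfice(alternatives, utility, threshold):
    """History-dependent satisficing: scan alternatives in presentation
    order and accept the first one whose utility clears the threshold;
    fall back to the last alternative seen."""
    choice = None
    for a in alternatives:
        choice = a
        if utility[a] >= threshold:
            return a
    return choice

u = {"a": 3, "b": 5, "c": 4}
# Both b and c clear the threshold of 4, so the first one presented
# wins -- a framing (order) effect.
print(satisfice(["b", "c", "a"], u, 4))  # b
print(satisfice(["c", "b", "a"], u, 4))  # c
```

A full utility maximizer would pick "b" in both orders; this cheaper rule does not, which is exactly the paper's point that rules less complex than utility maximization display framing effects.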
TLA in Pictures
1995
"... Predicateaction diagrams, which are similar to standard statetransition diagrams, are precisely defined as formulas of TLA (the Temporal Logic of Actions). We explain how these diagrams can be used to describe aspects of a specificationand those descriptions then proved correct even when the ..."
Abstract

Cited by 22 (0 self)
Predicate-action diagrams, which are similar to standard state-transition diagrams, are precisely defined as formulas of TLA (the Temporal Logic of Actions). We explain how these diagrams can be used to describe aspects of a specification, and those descriptions then proved correct, even when the complete specification cannot be written as a diagram. We also use the diagrams to illustrate proofs. Keywords: Concurrency, specification, state-transition diagrams, temporal logic. I. Introduction Pictures aid understanding. A simple flowchart is easier to understand than the equivalent programming-language text. However, complex pictures are confusing. A large, spaghetti-like flowchart is harder to understand than a properly structured program text. Pictures are inadequate for specifying complex systems, but they can help us understand particular aspects of a system. For a picture to provide more than an informal comment, there must be a formal connection between the complete specific...
Computing the Discrete Fréchet Distance in Subquadratic Time
2012
"... The Fréchet distance is a similarity measure between two curves ..."
Abstract

Cited by 19 (4 self)
The Fréchet distance is a similarity measure between two curves
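For context, the classic quadratic baseline this paper improves on is a simple dynamic program over the coupling recurrence:

```python
import math
from functools import lru_cache

def discrete_frechet(P, Q):
    """Classic O(mn) dynamic program for the discrete Frechet distance
    (the coupling distance between two polygonal curves)."""
    def d(i, j):
        return math.dist(P[i], Q[j])

    @lru_cache(maxsize=None)
    def c(i, j):
        # c(i, j): best achievable max leash length with the two
        # traversals standing at P[i] and Q[j].
        if i == 0 and j == 0:
            return d(0, 0)
        if i == 0:
            return max(c(0, j - 1), d(0, j))
        if j == 0:
            return max(c(i - 1, 0), d(i, 0))
        return max(min(c(i - 1, j), c(i - 1, j - 1), c(i, j - 1)), d(i, j))

    return c(len(P) - 1, len(Q) - 1)

P = [(0, 0), (1, 0), (2, 0)]
Q = [(0, 1), (1, 1), (2, 1)]
print(discrete_frechet(P, Q))  # 1.0
```

The paper's contribution is computing this quantity in subquadratic time, i.e. beating the O(mn) table fill sketched here.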
Inference and Analysis of Formal Models of Botnet Command and Control Protocols
2010
"... We propose a novel approach to infer protocol state machines in the realistic highlatency network setting, and apply it to the analysis of botnet Command and Control (C&C) protocols. Our proposed techniques enable an order of magnitude reduction in the number of queries and time needed to learn ..."
Abstract

Cited by 18 (2 self)
We propose a novel approach to infer protocol state machines in the realistic high-latency network setting, and apply it to the analysis of botnet Command and Control (C&C) protocols. Our proposed techniques enable an order of magnitude reduction in the number of queries and time needed to learn a botnet C&C protocol compared to classic algorithms (from days to hours for inferring the MegaD C&C protocol). We also show that the computed protocol state machines enable formal analysis for botnet defense, including finding the weakest links in a protocol, uncovering protocol design flaws, inferring the existence of unobservable communication back-channels among botnet servers, and finding deviations of protocol implementations which can be used for fingerprinting. We validate our technique by inferring the protocol state machine from Postfix’s SMTP implementation and comparing the inferred state machine to the SMTP standard. Further, our experimental results offer new insights into MegaD’s C&C, showing our technique can be used as a powerful tool for defense against botnets.
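The core idea of query-based state-machine inference can be illustrated with a toy: probe a black-box server with message sequences and distinguish the states it reaches by how they answer a fixed probe set. (The server, messages, and replies below are invented stand-ins; the paper's algorithm is a heavily optimized version of this idea that also copes with network latency):

```python
MESSAGES = ["HELO", "MAIL", "DATA", "QUIT"]

def black_box(seq):
    """Hypothetical 3-state toy server: HELO must precede MAIL,
    MAIL must precede DATA. Returns the reply to the last message."""
    state, reply = 0, None
    for msg in seq:
        if msg == "HELO":
            state, reply = 1, "250 OK"
        elif msg == "MAIL" and state == 1:
            state, reply = 2, "250 OK"
        elif msg == "DATA" and state == 2:
            reply = "354 SEND"
        elif msg == "QUIT":
            state, reply = 0, "221 BYE"
        else:
            reply = "503 BAD SEQUENCE"
    return reply

def signature(prefix):
    """Fingerprint the state reached by `prefix`: one membership
    query per probe message, each from a fresh reset."""
    return tuple(black_box(list(prefix) + [m]) for m in MESSAGES)

# Enumerate short prefixes and count behaviorally distinct states.
prefixes = ([()] + [(a,) for a in MESSAGES]
            + [(a, b) for a in MESSAGES for b in MESSAGES])
states = {signature(p) for p in prefixes}
print(len(states))  # 3
```

Each `signature` row plays the role of a row in an observation table: prefixes with identical rows are merged into one inferred state, which is how the learned machine stays small.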
Causal commutative arrows and their optimization
 In Proc. International Conference on Functional Programming (ICFP ’09), 2009
"... Arrows are a popular form of abstract computation. Being more general than monads, they are more broadly applicable, and in particular are a good abstraction for signal processing and dataflow computations. Most notably, arrows form the basis for a domain specific language called Yampa, which has be ..."
Abstract

Cited by 17 (1 self)
Arrows are a popular form of abstract computation. Being more general than monads, they are more broadly applicable, and in particular are a good abstraction for signal processing and dataflow computations. Most notably, arrows form the basis for a domain-specific language called Yampa, which has been used in a variety of concrete applications, including animation, robotics, sound synthesis, control systems, and graphical user interfaces. We are interested in optimizing the computations captured by Yampa. Unfortunately, arrows are not concrete enough to do this with precision. To remedy this situation we introduce the concept of commutative arrows that capture a kind of non-interference property of concurrent computations. We also add an init operator, and identify a crucial law that captures the causal nature of arrow effects. We call the resulting computational model causal commutative arrows. To study this class of computations in more detail, we define an extension to the simply typed lambda calculus called causal commutative arrows (CCA), and study its properties. Our key contribution is the identification of a normal form for CCA called causal commutative normal form (CCNF). By defining a normalization procedure we have developed an optimization strategy that yields dramatic improvements in performance over conventional implementations of arrows. We have implemented this technique in Haskell, and conducted benchmarks that validate the effectiveness of our approach. When combined with stream fusion, the overall methodology can result in speedups of greater than two orders of magnitude.
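The causal stream transformers underlying an arrow DSL like Yampa can be modeled as state-plus-step pairs. A Python sketch of arrow composition and the init (one-step delay) operator mentioned above (this mimics the stream semantics only, not Haskell's Arrow class or the paper's normal form):

```python
# A stream transformer is (initial_state, step) where
# step(state, x) -> (new_state, y). `compose` plays the role of the
# arrow's >>> and `init` is the one-step delay operator.

def arr(f):
    """Lift a pure function into a stateless transformer."""
    return (None, lambda s, x: (s, f(x)))

def compose(a, b):
    """Feed a's output stream into b, pairing their states."""
    (s0a, fa), (s0b, fb) = a, b
    def step(s, x):
        sa, sb = s
        sa2, y = fa(sa, x)
        sb2, z = fb(sb, y)
        return ((sa2, sb2), z)
    return ((s0a, s0b), step)

def init(i):
    """Causal delay: emit the previous input, starting with i."""
    return (i, lambda s, x: (x, s))

def run(a, xs):
    """Drive a transformer over a finite input stream."""
    s, f = a
    out = []
    for x in xs:
        s, y = f(s, x)
        out.append(y)
    return out

double = arr(lambda x: 2 * x)
delayed_double = compose(double, init(0))
print(run(delayed_double, [1, 2, 3]))  # [0, 2, 4]
```

In this representation, the paper's CCNF result says any causal commutative arrow collapses to a single pure function plus a single init, which is what makes the aggressive optimization possible.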