Results 1–10 of 231
The induction of dynamical recognizers
 Machine Learning
, 1991
Cited by 214 (16 self)
Abstract:
A higher order recurrent neural network architecture learns to recognize and generate languages after being "trained" on categorized exemplars. Studying these networks from the perspective of dynamical systems yields two interesting discoveries: First, a longitudinal examination of the learning process illustrates a new form of mechanical inference: induction by phase transition. A small weight adjustment causes a "bifurcation" in the limit behavior of the network. This phase transition corresponds to the onset of the network's capacity for generalizing to arbitrary-length strings. Second, a study of the automata resulting from the acquisition of previously published training sets indicates that while the architecture is not guaranteed to find a minimal finite automaton consistent with the given exemplars, which is an NP-hard problem, the architecture does appear capable of generating non-regular languages by exploiting fractal and chaotic dynamics. I end the paper with a hypothesis relating linguistic generative capacity to the behavioral regimes of nonlinear dynamical systems.
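The "higher order" architecture described here is a second-order recurrent network, whose next state depends on products of state units and input units. A minimal sketch of one such recognizer step, with random untrained weights and illustrative names (the paper's actual training procedure and dimensions are not reproduced):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def step(state, symbol, W):
    # Second-order update: the next state depends on products of the
    # current state vector and a one-hot input symbol.
    # W has shape (n_states, n_states, n_symbols).
    return sigmoid(np.einsum("ijk,j,k->i", W, state, symbol))

def run(string, W, n_symbols, init_state):
    # Feed a symbol sequence through the recognizer, one step per symbol.
    state = init_state
    for ch in string:
        x = np.zeros(n_symbols)
        x[ch] = 1.0
        state = step(state, x, W)
    return state

rng = np.random.default_rng(0)
n, m = 3, 2                      # state units, input alphabet size
W = rng.normal(size=(n, n, m))   # untrained weights, for illustration only
z0 = np.full(n, 0.5)
final = run([0, 1, 1, 0], W, m, z0)
accept = final[0] > 0.5          # threshold one state unit as the accept bit
```

Viewed as a dynamical system, each input symbol selects a nonlinear map on the state space; the paper's point is that small changes to W can bifurcate the long-run behavior of these iterated maps.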
Using statistics in lexical analysis
 Lexical Acquisition: Exploiting On-Line Resources to Build a Lexicon
, 1991
Cited by 144 (3 self)
Abstract:
The computational tools available for studying machine-readable corpora are at present still rather primitive. In the more advanced lexicographic organizations, there are concordancing programs (see figure below), which are basically KWIC (key word in context; Aho et al., 1988, p. 122; Salton, 1989, p. 384) indexes with additional features such as the ability to extend the context, sort leftwards as well as
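A KWIC index of the kind described can be sketched in a few lines; the whitespace tokenization and context width here are illustrative choices, not those of any particular concordancing program:

```python
def kwic(tokens, keyword, width=3):
    """Return (left-context, keyword, right-context) triples for each
    occurrence of `keyword`, as in a concordancing program."""
    lines = []
    for i, tok in enumerate(tokens):
        if tok == keyword:
            left = " ".join(tokens[max(0, i - width):i])
            right = " ".join(tokens[i + 1:i + 1 + width])
            lines.append((left, tok, right))
    return lines

text = "the cat sat on the mat near the door".split()
hits = kwic(text, "the", width=2)
# hits[1] == ("sat on", "the", "mat near")
```

The "sort leftwards" feature mentioned in the abstract corresponds to ordering the hits by the reversed left context, e.g. `sorted(hits, key=lambda h: h[0].split()[::-1])`.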
Computation at the onset of chaos
 The Santa Fe Institute, Westview
, 1988
Cited by 83 (14 self)
Abstract:
Computation at levels beyond storage and transmission of information appears in physical systems at phase transitions. We investigate this phenomenon using minimal computational models of dynamical systems that undergo a transition to chaos as a function of a nonlinearity parameter. For period-doubling and band-merging cascades, we derive expressions for the entropy, the interdependence of machine complexity and entropy, and the latent complexity of the transition to chaos. At the transition, deterministic finite automaton models diverge in size. Although there is no regular or context-free Chomsky grammar in this case, we give finite descriptions at the higher computational level of context-free Lindenmayer systems. We construct a restricted indexed context-free grammar and its associated one-way nondeterministic nested stack automaton for the cascade limit language. This analysis of a family of dynamical systems suggests a complexity-theoretic description of phase transitions based on the informational diversity and computational complexity of observed data that is independent of particular system control parameters. The approach gives a much more refined picture of the architecture of critical states than is available via
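The "transition to chaos as a function of a nonlinearity parameter" can be illustrated with the logistic map, a standard minimal model of a period-doubling cascade (chosen here for illustration; the parameters and coarse-graining are generic, not the paper's specific construction). Coarse-grained symbol sequences are simple in a periodic regime and far more diverse in the chaotic regime:

```python
def logistic_orbit(r, x0=0.4, n=2000, burn=500):
    """Iterate the logistic map x -> r*x*(1-x), discarding a transient."""
    x, orbit = x0, []
    for i in range(n + burn):
        x = r * x * (1.0 - x)
        if i >= burn:
            orbit.append(x)
    return orbit

def symbol_words(orbit, length):
    """Binary coarse-graining (0 if x < 0.5 else 1), then collect the
    set of distinct symbol words of a given length."""
    s = "".join("0" if x < 0.5 else "1" for x in orbit)
    return {s[i:i + length] for i in range(len(s) - length + 1)}

periodic = symbol_words(logistic_orbit(3.5), 4)   # period-4 regime
chaotic  = symbol_words(logistic_orbit(4.0), 4)   # chaotic regime
```

A finite automaton describing the periodic symbol language stays small; as the parameter approaches the onset of chaos, the number of distinct words (and hence the size of a consistent automaton) grows, which is the divergence the abstract refers to.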
Statistical methods and linguistics
 THE BALANCING ACT: COMBINING SYMBOLIC AND STATISTICAL APPROACHES TO LANGUAGE
, 1996
Cited by 79 (0 self)
Abstract:
In the space of the last ten years, statistical methods have gone from being virtually unknown in computational linguistics to being a fundamental given. In 1996, no one can profess to be a computational linguist without a passing knowledge of statistical methods. HMM's are as de rigueur as LR tables, and anyone who cannot at least use the terminology persuasively risks being mistaken for kitchen help at the ACL banquet. More seriously, statistical techniques have brought significant advances in broad-coverage language processing. Statistical methods have made real progress possible on a number of issues that had previously stymied attempts to liberate systems from toy domains; issues that include disambiguation, error correction, and the induction of the sheer volume of information requisite for handling unrestricted text. And the sense of progress has generated a great deal of enthusiasm for statistical methods in computational linguistics. However, this enthusiasm has not been catching in linguistics proper. It is always dangerous to generalize about linguists, but I think it is fair to say
The Use of Positional Information in the Modeling of Plants
, 2001
Cited by 79 (13 self)
Abstract:
We integrate into plant models three elements of plant representation identified as important by artists: posture (manifested in curved stems and elongated leaves), gradual variation of features, and the progression of the drawing process from overall silhouette to local details. The resulting algorithms increase the visual realism of plant models by offering an intuitive control over plant form and supporting an interactive modeling process. The algorithms are united by the concept of expressing local attributes of plant architecture as functions of their location along the stems.
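The closing idea, local attributes expressed as functions of position along the stem, can be sketched with a hypothetical positional function. The Gaussian profile, the parameter names, and the sampled attribute (leaf length) are all invented for illustration; the paper's actual functions are editor-defined curves:

```python
import math

def leaf_length(pos, base=1.0, peak=0.6, sharpness=2.0):
    """Illustrative positional function: an attribute value (here, leaf
    length) as a function of normalized position `pos` in [0, 1] along
    the stem, peaking near `peak`."""
    return base * math.exp(-sharpness * (pos - peak) ** 2)

# Sample the function at evenly spaced node positions along a stem.
n_leaves = 5
lengths = [leaf_length(i / (n_leaves - 1)) for i in range(n_leaves)]
```

The design point is that every organ queries one shared function of its stem position, so editing that single curve changes the gradual variation of the whole plant at once.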
The reactable: Exploring the synergy between live music performance and tabletop tangible interfaces
 In Proceedings of the first international conference on “Tangible and Embedded Interaction”, Baton
, 2007
Cited by 79 (7 self)
Abstract:
In recent years we have seen a proliferation of musical tables. Believing that this is not just the result of a tabletop trend, in this paper we first discuss several of the reasons for which live music performance and HCI in general, and musical instruments and tabletop interfaces in particular, can lead to a fertile two-way cross-pollination that can equally benefit both fields. After that, we present the reacTable, a musical instrument based on a tabletop interface that exemplifies several of these potential achievements.
Author Keywords: tangible interfaces, tabletop interfaces, musical instrument, musical performance, design, interaction techniques.
ACM Classification Keywords: H.5.2 [User Interfaces]: interaction styles, input devices and strategies; J.5 [Arts and Humanities]: performing arts.
A Probabilistic Earley Parser as a Psycholinguistic Model
 IN PROCEEDINGS OF NAACL
, 2001
Cited by 64 (5 self)
Abstract:
In human sentence processing, cognitive load can be defined many ways. This report considers a definition of cognitive load in terms of the total probability of structural options that have been disconfirmed at some point in a sentence: the surprisal of word w_i given its prefix w_0 ... w_{i-1} on a phrase-structural language model. These loads can be efficiently calculated using a probabilistic Earley parser (Stolcke, 1995), which is interpreted as generating predictions about reading time on a word-by-word basis. Under grammatical assumptions supported by corpus-frequency data, the operation of Stolcke's probabilistic Earley parser correctly predicts processing phenomena associated with garden-path structural ambiguity and with the subject/object relative asymmetry.
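The paper computes these prefix probabilities with Stolcke's probabilistic Earley parser over a phrase-structure grammar; as a stand-in, a toy bigram model with invented probabilities is enough to show the surprisal quantity itself:

```python
import math

# Hypothetical conditional probabilities P(word | previous word),
# invented for illustration; the paper uses grammar prefix probabilities.
bigram = {
    ("<s>", "the"):     0.6,
    ("<s>", "a"):       0.4,
    ("the", "horse"):   0.1,
    ("the", "dog"):     0.9,
    ("horse", "raced"): 1.0,
}

def surprisal(prev, word):
    """Surprisal in bits: -log2 P(word | prefix), here approximated
    with a bigram model instead of a full phrase-structural model."""
    return -math.log2(bigram[(prev, word)])

# Word-by-word loads for a prefix of the classic garden-path sentence.
loads = [surprisal(p, w) for p, w in
         [("<s>", "the"), ("the", "horse"), ("horse", "raced")]]
```

The low-probability continuation "horse" carries more bits than "dog" after "the", which is the sense in which disconfirmed structural options show up as predicted reading-time load.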
Models of Computation: Exploring the Power of Computing
Cited by 57 (7 self)
Abstract:
Theoretical computer science treats any computational subject for which a good model can be created. Research on formal models of computation was initiated in the 1930s and 1940s by Turing, Post, Kleene, Church, and others. In the 1950s and 1960s, programming languages, language translators, and operating systems were under development and therefore became both the subject and basis for a great deal of theoretical work. The power of computers of this period was limited by slow processors and small amounts of memory, and thus theories (models, algorithms, and analysis) were developed to explore the efficient use of computers as well as the inherent complexity of problems. The former subject is known today as algorithms and data structures, the latter as computational complexity. The focus of theoretical computer scientists in the 1960s on languages is reflected in the first textbook on the subject, Formal Languages and Their Relation to Automata by John Hopcroft and Jeffrey Ullman. This influential book led to the creation of many language-centered theoretical computer science courses; many introductory theory courses today continue to reflect the content of this book and the interests of theoreticians of the 1960s and early 1970s. Although
A Convergent Gambling Estimate of the Entropy of English
 IEEE Transactions on Information Theory
, 1978
Cited by 53 (1 self)
Abstract:
In his original paper on the subject, Shannon found upper and lower bounds for the entropy of printed English based on the number of trials required for a subject to guess subsequent symbols in a given text. The guessing approach precludes asymptotic consistency of either the upper or lower bounds except for degenerate ergodic processes. Shannon's technique of guessing the next symbol is altered by having the subject place sequential bets on the next symbol of text. If S_n denotes the subject's capital after n bets at 27-for-1 odds, and if it is assumed that the subject knows the underlying probability distribution for the process X, then the entropy estimate is H_n(X) = (1 - (1/n) log_27 S_n) log_2 27 bits/symbol. If the subject does not know the true probability distribution for the stochastic process, then H_n(X) is an asymptotic upper bound for the true entropy; these bounds follow using the boundedness and continuity of h(p) = -p log p - (1-p) log (1-p). If X is stationary, E H_n(X) converges to a limit that dominates H(X), the true entropy. In addition, if English is an ergodic process, then the Shannon-McMillan-Breiman theorem states that -(1/n) log p(X_1, ..., X_n) -> H(X) a.e., so if printed English is indeed an ergodic process, then for sufficiently large n a good estimate of H(X) can be obtained from knowledge of p(.) on a randomly drawn
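The estimator in the abstract can be computed directly from the gambler's final capital. The capital values below are invented to show the two regimes (a gambler who merely preserves capital versus one whose capital grows):

```python
import math

def gambling_entropy_estimate(S_n, n, alphabet_size=27):
    """Entropy estimate in bits/symbol from a gambler's capital S_n
    after n sequential 27-for-1 bets on the next symbol:
        H_n = (1 - (1/n) * log_27 S_n) * log_2 27
    """
    return (1.0 - math.log(S_n, alphabet_size) / n) * math.log2(alphabet_size)

# S_n = 1: no net winnings, so the estimate is the worst case,
# log_2 27 (about 4.75 bits/symbol).
h_flat = gambling_entropy_estimate(1, 1000)

# Hypothetical skilled gambler whose capital grew to 27**700 after
# 1000 bets: the estimate drops to (1 - 0.7) * log_2 27.
h_good = gambling_entropy_estimate(27 ** 700, 1000)
```

The better the gambler's sequential predictions, the faster the capital grows and the lower (and tighter) the resulting upper bound on the entropy of the text.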