Results 1-10 of 40
Knowledge-Based Artificial Neural Networks, 1994
Abstract

Cited by 151 (13 self)
Hybrid learning methods use theoretical knowledge of a domain and a set of classified examples to develop a method for accurately classifying examples not seen during training. The challenge of hybrid learning systems is to use the information provided by one source of information to offset information missing from the other source. By so doing, a hybrid learning system should learn more effectively than systems that use only one of the information sources. KBANN (Knowledge-Based Artificial Neural Networks) is a hybrid learning system built on top of connectionist learning techniques. It maps problem-specific "domain theories", represented in propositional logic, into neural networks and then refines this reformulated knowledge using backpropagation. KBANN is evaluated by extensive empirical tests on two problems from molecular biology. Among other results, these tests show that the networks created by KBANN generalize better than a wide variety of learning systems, as well as several t...
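The rule-to-network mapping described in this abstract can be illustrated with a toy sketch. The weight-and-bias scheme below follows the common textbook rendering of KBANN's translation (unnegated antecedents get weight +omega, negated ones -omega, bias set so the unit fires only when every antecedent is satisfied); the constant names are assumptions, not the paper's exact values.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def conjunction_unit(inputs, negated, omega=4.0):
    # KBANN-style encoding of one propositional conjunction as a unit:
    # unnegated antecedents get weight +omega, negated ones -omega, and
    # the bias is chosen so the unit's net input is positive only when
    # all antecedents are satisfied.
    p = sum(1 for n in negated if not n)  # number of positive antecedents
    bias = -(p - 0.5) * omega
    net = bias + sum((-omega if neg else omega) * x
                     for x, neg in zip(inputs, negated))
    return sigmoid(net)

# Rule: C :- A, not B.
print(conjunction_unit([1, 0], [False, True]))  # high: rule satisfied
print(conjunction_unit([1, 1], [False, True]))  # low: B violates the rule
```

After this symbolic initialization, backpropagation would then adjust the weights away from these hand-set values, which is the "refinement" step the abstract describes.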
Beyond Turing Machines
Abstract

Cited by 33 (4 self)
In this paper we describe and analyze models of problem solving and computation going beyond Turing Machines. Three principles of extending the Turing Machine's expressiveness are identified, namely, by interaction, evolution and infinity. Several models utilizing the above principles are presented. ...
The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations, 1993
Abstract

Cited by 17 (2 self)
In order for neural networks to learn complex languages or grammars, they must have sufficient computational power or resources to recognize or generate such languages. Though many approaches to effectively utilizing the computational power of neural networks have been discussed, an obvious one is to couple a recurrent neural network with an external stack memory, in effect creating a neural network pushdown automaton (NNPDA). This NNPDA generalizes the concept of a recurrent network so that the network becomes a more complex computing structure. This paper discusses an NNPDA in detail: its construction, how it can be trained, and how useful symbolic information can be extracted from the trained network. To effectively couple the external stack to the neural network, an optimization method is developed which uses an error function that connects the learning of the state automaton of the neural network to the learning of the operation of the external stack: push, pop, and no-operation. To minimize the error function using gradient descent learning, an analog stack is designed such that the action and storage of information in the stack are continuous. One interpretation of a continuous stack is the probabilistic storage of and action on data. After training on sample strings of an unknown source grammar, a quantization procedure extracts from the analog stack and neural network a discrete pushdown automaton (PDA). Simulations show that in learning deterministic context-free grammars (the balanced-parenthesis language, 1^n 0^n, and the deterministic palindrome language) the extracted PDA is correct in the sense that it can correctly recognize unseen strings of arbitrary length. In addition, the extracted PDAs can be shown to be identical or equivalent to the PDAs of the source grammars which were used to generate the training strings.
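The "continuous stack" idea above can be sketched in a few lines. This is a simplified reading, not the paper's exact construction: the thickness representation and function names here are assumptions. Each stack element is a (symbol, thickness) pair, push and pop act by continuous amounts, and reading the top returns a thickness-weighted average, matching the abstract's "probabilistic storage" interpretation.

```python
def push(stack, symbol, amount):
    # Push `amount` (in [0, 1]) of `symbol` onto the analog stack.
    if amount > 0:
        stack.append([symbol, amount])
    return stack

def pop(stack, amount):
    # Remove `amount` of total thickness from the top, thinning the
    # topmost element rather than deleting it when it is thick enough.
    remaining = amount
    while stack and remaining > 0:
        sym, thick = stack[-1]
        if thick <= remaining:
            stack.pop()
            remaining -= thick
        else:
            stack[-1][1] = thick - remaining
            remaining = 0.0
    return stack

def top_reading(stack, depth=1.0):
    # Thickness-weighted average symbol over the top `depth` of the
    # stack: a continuous analogue of reading the top symbol.
    total, weight = 0.0, 0.0
    for sym, thick in reversed(stack):
        take = min(thick, depth - weight)
        total += sym * take
        weight += take
        if weight >= depth:
            break
    return total / weight if weight else 0.0

s = []
push(s, 1.0, 0.6)
push(s, 0.0, 0.4)
pop(s, 0.5)              # removes all 0.4 of symbol 0 and 0.1 of symbol 1
print(top_reading(s))    # reads the remaining 0.5 of symbol 1
```

Because every operation is differentiable in its amounts, a gradient-descent learner can drive the push/pop/no-operation signals continuously, which is what makes the quantization step at the end of training meaningful.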
Boolean Dynamics with Random Couplings, 2002
Abstract

Cited by 12 (0 self)
This paper reviews a class of generic dissipative dynamical systems called NK models. In these models, the dynamics of N elements, defined as Boolean variables, develop step by step, clocked by a discrete time variable. Each of the N Boolean elements at a given time is given a value which depends upon K elements in the previous time step. We review the work of many authors on the behavior of the models, looking particularly at the structure and lengths of their cycles, the sizes of their basins of attraction, and the flow of information through the systems. In the limit of infinite N, there is a phase transition between a chaotic and an ordered phase, with a critical phase in between. We argue that the behavior of this system depends significantly on the topology of the network connections. If the elements are placed upon a lattice with dimension d, the system shows correlations related to the standard percolation or directed percolation phase transition on such a lattice. On the other hand, a very different behavior is seen in the Kauffman net in which all spins are equally likely to be coupled to a given spin. In this situation, coupling loops are mostly suppressed, and the behavior of the system is much more like that of a mean field theory. We also describe possible applications of the models to, for example, genetic networks, cell differentiation, evolution, democracy in social systems and neural networks.
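The NK dynamics just described are straightforward to simulate. The sketch below (function names are illustrative) builds a random Kauffman net in which each of N Boolean elements is driven by K randomly chosen elements through a random Boolean lookup table, steps it synchronously, and measures the length of the cycle reached from a given initial state.

```python
import random

def random_nk_net(n, k, seed=0):
    # Each element i gets k distinct random inputs and a random
    # Boolean function over those inputs, stored as a truth table.
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    # Synchronous update: every element reads its k inputs from the
    # previous time step and looks up its next value.
    return tuple(
        tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
        for i in range(len(state))
    )

def cycle_length(state, inputs, tables):
    # The state space is finite, so the trajectory must eventually
    # revisit a state; the gap between visits is the cycle length.
    seen = {}
    t = 0
    while state not in seen:
        seen[state] = t
        state = step(state, inputs, tables)
        t += 1
    return t - seen[state]

inputs, tables = random_nk_net(n=8, k=2, seed=1)
print(cycle_length((0,) * 8, inputs, tables))
```

Sweeping k in such a simulation is one way to see the ordered-to-chaotic transition the abstract mentions: cycle lengths stay short for small k and grow rapidly once the coupling is dense enough.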
Computational Modeling vs. Computational Explanation: Is Everything a Turing Machine, and Does It Matter to the Philosophy of Mind? Australasian Journal of Philosophy, 2007
Abstract

Cited by 8 (4 self)
According to pancomputationalism, everything is a computing system. In this paper, I distinguish between different varieties of pancomputationalism. I find that although some varieties are more plausible than others, only the strongest variety is relevant to the philosophy of mind, but only the most trivial varieties are true. As a side effect of this exercise, I offer a clarified distinction between computational modelling and computational explanation. I. Pancomputationalism and the Computational Theory of Mind The main target of this paper is pancomputationalism, according to which everything is a computing system. I have encountered two peculiar responses to pancomputationalism: some philosophers find it obviously false, too silly to be worth refuting; others find it obviously true, too trivial to require a defence. Neither camp sees the need for this paper. But neither camp seems aware of the other camp. The existence of both camps, together with continuing appeals to pancomputationalism in the literature, compels me to analyse the matter more closely. In this paper, I distinguish between different varieties of pancomputationalism. I find that although some are more plausible than others, only the strongest variety is relevant to the philosophy of mind, but only the most trivial varieties are true. As a side effect of this exercise, I offer a clarified distinction between computational modelling and computational explanation. The canonical formulation of pancomputationalism is due to Hilary Putnam: 'everything is a Probabilistic Automaton under some Description' [Putnam 1999: 31]; 'probabilistic automaton' is Putnam's term for ...
Learning to See Analogies: a Connectionist Exploration, 1997
Abstract

Cited by 7 (2 self)
This dissertation explores the integration of learning and analogy-making through the development of a computer program, called Analogator, that learns to make analogies by example. By "seeing" many different analogy problems, along with possible solutions, Analogator gradually develops an ability to make new analogies. That is, it learns to make analogies by analogy. This approach stands in contrast to most existing research on analogy-making, in which the a priori existence of analogical mechanisms within a model is typically assumed. The present research extends standard connectionist methodologies by developing a specialized associative training procedure for a recurrent network architecture. The network is trained to divide input scenes (or situations) into appropriate figure and ground components. Seeing one scene in terms of a particular figure and ground provides the context for seeing another in an analogous fashion. After training, the model is able to make new analogies...
From dissonance to resonance: Cognitive interdependence in quantitative finance, Economy and Society, 2012
Abstract

Cited by 4 (1 self)
This study explores the elusive social dimension of quantitative finance. We conducted three years of observations in the derivatives trading room of a major investment bank. We found that traders use models to translate stock prices into estimates of what their rivals think. Traders use these estimates to look out for possible errors in their own models. We found that this practice, reflexive modeling, enhances returns by turning prices into a vehicle for distributed cognition. But it also induces a dangerous form of cognitive interdependence: when enough traders overlook a key issue, their positions give misplaced reassurance to those traders that think similarly, disrupting their reflexive processes. In cases lacking diversity, dissonance thus gives way to resonance. Our analysis demonstrates how practices born in caution can lead to overconfidence and collective failure. We contribute to economic sociology by developing a sociotechnical account that grapples with the new forms of sociality introduced by financial models: disembedded yet entangled; anonymous yet collective; impersonal yet, nevertheless, emphatically social.
Mobile Robot Local Navigation with a Polar Neural Map, 1998
Abstract

Cited by 4 (3 self)
(Front-matter excerpt; no abstract available.)
List of Tables: Table 1, Algorithm for path construction; Table 2, Algorithm for determining the next movement direction; Table 3, Computational complexity of the system.
List of Figures: Figure 1, Several mobile robots of the class we consider in the thesis; Figure 2, The concept of a neural map; Figure 3, Self-organization at the neural mapping level (more neurons are assigned to the 'important' areas of X; the mapping is shown by superimposing the neural field F on the signal space X); Figure 4, The basic nonlinear processing unit or neuron; Figure 5, The architecture of the subsystem for path planning with neural maps; Figure 6, Different network topologies and connections for 2-dimensional uniform coverage; Figure 7, Connection weight as a function of distance; Figure 8, The nonlinear activation function; Figure 9, The target and obstacle configurations (left) and the contours of the equilibrium surface (right); Figure 10, Network equilibrium state of a 50x50 neural map for a single target; Figure 11, The target and obstacle configurations (left) and the contours of the equilibrium surface (right); Figure 12, Network equilibrium state of a 50x50 neural map for multiple targets; Figure 13, Update rasters on a 2-dimensional lattice; ...
Learning Geometric Transformations with Clifford Neurons, 2000
Abstract

Cited by 2 (2 self)
In this paper we propose a new type of neuron developed in the framework of Clifford algebra. It is shown how this novel Clifford neuron covers complex and quaternionic neurons and that it can compute orthogonal transformations very efficiently. The introduced framework can also be used for neural computation of nonlinear geometric transformations, which makes it very promising for applications. As an example we develop a Clifford neuron that computes the cross-ratio via the corresponding Möbius transformation. Experimental results for the proposed novel neural models are reported.
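As a minimal illustration of how a complex-valued neuron computes an orthogonal transformation, the sketch below covers only the complex special case mentioned in the abstract; the general Clifford-algebra construction is richer, and the function name is an assumption. A single neuron whose weight is a unit complex number rotates its 2-D input, with no hidden layer at all.

```python
import math

def complex_neuron(w, z):
    # The neuron's entire computation is one geometric product:
    # multiplying by a unit complex weight w rotates z in the plane,
    # so a single weight encodes a full 2-D orthogonal transformation.
    return w * z

theta = math.pi / 2
w = complex(math.cos(theta), math.sin(theta))  # weight = rotation by 90 degrees
out = complex_neuron(w, complex(1.0, 0.0))
print(round(out.real, 6), round(out.imag, 6))  # (1, 0) rotated toward (0, 1)
```

Learning such a rotation amounts to fitting a single complex weight instead of a 2x2 real weight matrix, which is the efficiency argument the abstract points to; quaternionic neurons extend the same trick to rotations in higher dimensions.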