Results 1–3 of 3
Perception processing for general intelligence: Bridging the symbolic/subsymbolic gap
In: Proceedings of AGI-12, Lecture Notes in Computer Science, 2012
Abstract

Cited by 2 (2 self)
Bridging the gap between symbolic and subsymbolic representations is a – perhaps the – key obstacle along the path from the present state of AI achievement to human-level artificial general intelligence. One approach to bridging this gap is hybridization – for instance, incorporation of a subsymbolic system and a symbolic system into an integrative cognitive architecture. Here we present a detailed design for an implementation of this approach, via integrating a version of the DeSTIN deep learning system into OpenCog, an integrative cognitive architecture including rich symbolic capabilities. This is a "tight" integration, in which the symbolic and subsymbolic aspects exert detailed real-time influence on each other's operations. Part I of this paper has described in detail the revisions to DeSTIN needed to support this integration, which are mainly along the lines of making it more "representationally transparent," so that its internal states are easier for OpenCog to understand. We describe an extension of the approach beyond vision to include multisensory integration and perception-action integration. We discuss the potential use of this integrated system to control mobile robots, as exemplified via a thought-experiment involving eye-hand coordination.
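The "representational transparency" idea above – exposing a subsymbolic system's internal states in a form a symbolic reasoner can consume – can be illustrated with a toy sketch. Everything here is hypothetical (the function and predicate names are illustrative, not the actual DeSTIN or OpenCog API): each perception node's centroid activations are thresholded into symbolic (predicate, node, label, strength) tuples.

```python
def states_to_atoms(centroid_labels, activations, threshold=0.7):
    """Map per-node centroid activations to symbolic 'atoms'.

    Any node whose strongest centroid activation exceeds the threshold
    yields a (predicate, node, label, strength) tuple that a symbolic
    system could ingest. Purely illustrative of the transparency idea.
    """
    atoms = []
    for node_id, acts in enumerate(activations):
        best = max(range(len(acts)), key=lambda i: acts[i])
        if acts[best] >= threshold:
            atoms.append(("ActivePattern", f"node{node_id}",
                          centroid_labels[best], acts[best]))
    return atoms

# Toy activations: node 0 is confidently a "corner"; node 1 is ambiguous
# and produces no atom, so the symbolic side is not flooded with noise.
acts = [[0.1, 0.85, 0.05],
        [0.4, 0.30, 0.30]]
atoms = states_to_atoms(["edge", "corner", "blob"], acts)
```

The thresholding is the key design point: only confident subsymbolic states cross the bridge, keeping the symbolic side's workload bounded.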
Bayesian Network Automata for Modelling Unbounded Structures
Abstract
This paper proposes a framework which unifies graphical model theory and formal language theory through automata theory. Specifically, we propose Bayesian Network Automata (BNAs) as a formal framework for specifying graphical models of arbitrarily large structures, or equivalently, specifying probabilistic grammars in terms of graphical models. BNAs use a formal automaton to specify how to construct an arbitrarily large Bayesian Network by connecting multiple copies of a bounded Bayesian Network. Using a combination of results from graphical models and formal language theory, we show that, for a large class of automata, the complexity of inference with a BNA is bounded by the complexity of inference in the bounded Bayesian Network times the complexity of inference for the equivalent stochastic automaton. This illustrates that BNAs provide a useful framework for developing and analysing models and algorithms for structure prediction.
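The construction described above – an automaton stitching together copies of a bounded Bayesian network, with total inference cost bounded by (per-copy inference cost) × (automaton inference cost) – can be sketched for the simplest case, a linear automaton, where the resulting chain-structured network is an HMM and the forward algorithm realizes the stated complexity bound. The probability tables below are invented toy numbers, not from the paper.

```python
from itertools import product

# Bounded "template" network: one hidden variable H over states {a, b}
# with one observed child O. A linear automaton reading n symbols stitches
# n copies into a chain (an HMM); the BNA framework generalizes this.
P_H0 = {"a": 0.6, "b": 0.4}                  # prior over the first copy
P_trans = {("a", "a"): 0.7, ("a", "b"): 0.3,
           ("b", "a"): 0.2, ("b", "b"): 0.8}  # copy-to-copy connection
P_obs = {("a", 0): 0.9, ("a", 1): 0.1,
         ("b", 0): 0.3, ("b", 1): 0.7}        # within-copy factor

def forward_prob(obs):
    """P(observations) via the forward algorithm: per automaton step we do
    a constant amount of inference in the bounded template network, so the
    total cost is (per-copy cost) x (number of steps)."""
    alpha = {h: P_H0[h] * P_obs[(h, obs[0])] for h in "ab"}
    for o in obs[1:]:
        alpha = {h2: sum(alpha[h1] * P_trans[(h1, h2)] for h1 in "ab")
                     * P_obs[(h2, o)]
                 for h2 in "ab"}
    return sum(alpha.values())
```

Because the automaton only ever connects adjacent copies, the treewidth of the unrolled network stays bounded regardless of input length, which is what keeps inference tractable.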
Improvements to Training an RNN parser
Abstract
Many parsers learn sparse class distributions over trees to model natural language. Recursive Neural Networks (RNNs) use much denser representations, yet can still achieve an F-score of 92.06% for right-binarized sentences up to 15 words long. We examine an RNN model by comparing it with an abstract generative probabilistic model using a Deep Belief Network (DBN). The DBN provides both an upwards- and downwards-pointing conditional model, drawing a connection between RNN and Charniak-type parsers, while analytically predicting average scoring parameters in the RNN. In addition, we apply the RNN to longer sentences and develop two methods which, while having negligible effect on short-sentence parsing, are able to improve the parsing F-score by 0.83% on longer sentences.
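The dense-representation idea behind recursive neural network parsers can be sketched minimally: one shared composition function maps the vectors of two children to a parent vector, applied bottom-up over a binarized tree. The weights and word vectors below are random placeholders; this is the generic RNN composition, not the specific model or scoring function from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                                     # toy embedding dimension
W = rng.standard_normal((d, 2 * d)) * 0.1  # shared composition matrix
b = np.zeros(d)

def compose(left, right):
    """RNN composition: parent = tanh(W [left; right] + b).
    The same W and b are reused at every binary node of the parse tree."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

# Bottom-up pass over the right-binarized tree ((the cat) sat):
the, cat, sat = (rng.standard_normal(d) for _ in range(3))
phrase = compose(the, cat)   # vector for the noun phrase "the cat"
sent = compose(phrase, sat)  # vector for the whole sentence, shape (d,)
```

Every constituent, regardless of size, is squeezed into the same d-dimensional vector, which is exactly the dense alternative to the sparse tree-class distributions mentioned in the abstract.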