Results 1–8 of 8
Continuous Formal Systems: A Unifying Model in Language and Cognition
In Proceedings of the IEEE Workshop on Architectures for Semiotic Modeling and Situation Analysis in Large Complex Systems, 1995
Abstract

Cited by 15 (10 self)
… this paper we outline the general characteristics of continuous formal systems …
THE ELEMENTS OF CONSCIOUSNESS AND THEIR NEURODYNAMICAL CORRELATES
1996
Abstract

Cited by 9 (7 self)
The ‘hard problem’ is hard because of the special epistemological status of consciousness, which does not, however, preclude its scientific investigation. Data from phenomenologically trained observers can be combined with neurological investigations to establish the relation between experience and neurodynamics. Although experience cannot be reduced to physical phenomena, parallel phenomenological and neurological analyses allow the structure of experience to be related to the structure of the brain. Such an analysis suggests a theoretical entity, an elementary unit of experience, the protophenomenon, which corresponds to an activity site (such as a synapse) in the brain. The structure of experience is determined by connections (e.g. dendrites) between these activity sites; the connections correspond to temporal patterns among the elementary units of experience, which can be expressed mathematically. This theoretical framework illuminates several issues, including degrees of consciousness, nonbiological consciousness, sensory inversions, unity of consciousness and the unconscious mind.
Protophenomena And Their Neurodynamical Correlates
1996
Abstract

Cited by 8 (7 self)
The ‘hard problem’ is hard because of the special epistemological status of consciousness, which does not, however, preclude its scientific investigation. Data from phenomenologically trained observers can be combined with neurological investigations to establish the relation between experience and neurodynamics. Although experience cannot be reduced to physical phenomena, parallel phenomenological and neurological analyses allow the structure of experience to be related to the structure of the brain. Such an analysis suggests a theoretical entity, an elementary unit of experience, the protophenomenon, which corresponds to an activity site (such as a synapse) in the brain. The structure of experience is determined by connections (e.g. dendrites) between these activity sites; the connections correspond to temporal patterns among the elementary units of experience, which can be expressed mathematically. This theoretical framework illuminates several issues, including degrees of consciousness, nonbiological consciousness, sensory inversions, unity of consciousness and the unconscious mind.
Super-Turing or Non-Turing? Extending the Concept of Computation
Abstract

Cited by 8 (8 self)
“Hypercomputation” is often defined as transcending Turing computation in the sense of computing a larger class of functions than can Turing machines. While this possibility is important and interesting, this paper argues that there are many other important senses in which we may “transcend Turing computation.” Turing computation, like all models, exists in a frame of relevance, which underlies the assumptions on which it rests and the questions that it is suited to answer. Although appropriate in many circumstances, there are other important applications of the idea of computation for which this model is not relevant. Therefore we should supplement it with new models based on different assumptions and suited to answering different questions. In alternative frames of relevance, including natural computation and nanocomputation, the central issues include real-time response, continuity, indeterminacy, and parallelism. Once we understand computation in a broader sense, we can see new possibilities for using physical processes to achieve computational goals, which will increase in importance as we approach the limits of electronic binary logic. Key words: hypercomputation, Church-Turing thesis, natural computation, theory of computation, model of computation, Turing computation
The nature of computing — computing in nature
2005
Abstract

Cited by 1 (1 self)
My goal in this report is to recontextualize the concept of computation. I review the historical roots of Church-Turing computation to show that the theory exists in a frame of relevance, which underlies the assumptions on which it rests and the questions it is suited to answer. Although this frame of relevance is appropriate in many circumstances, there are many important applications of the idea of computation for which it is not relevant. These include natural computation (computation occurring in or inspired by nature), nanocomputation (computation based on nanoscale objects and processes), and computation based on quantum theory. As a consequence we need, not so much to abandon the Church-Turing model of computation, as to supplement it with new models based on different assumptions and suited to answering different questions. Therefore I will discuss alternative frames of relevance more suited to the interrelated application areas of natural computation, emergent computation, and nanocomputation. Central issues include continuity, indeterminacy, and parallelism. Finally, I will argue that once we understand computation in a broader sense than the Church-Turing model, we begin to see new possibilities for using natural processes to achieve our computational goals. These possibilities will increase in importance as we approach the limits of electronic binary logic as a basis for computation. They will also help us to understand computational processes in nature. * This report is based on an invited presentation at the workshop “Natural Processes & Models of …
ON AN EIGENFLOW EQUATION AND ITS LIE ALGEBRAIC GENERALIZATION
Abstract
This paper deals with a dynamical system of the form Ȧ = [[N, Aᵀ + A], A] + ν[[Aᵀ, A], A], where A is an n × n real matrix, N is a constant n × n real matrix, ν is a positive constant, and [A, B] = AB − BA. In particular, the purpose of this paper is to establish a sorting behavior of the dynamical system and to represent it in a general Lie algebraic setting. Moreover, some applications of the dynamical system are presented.
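The sorting behavior the abstract mentions can be observed numerically. The following is a rough sketch, not the paper's analysis: for a symmetric initial matrix, Aᵀ + A = 2A and [Aᵀ, A] = 0, so the flow reduces to a double-bracket form, and explicit Euler integration drives A toward a diagonal matrix while preserving its trace. All names and parameter choices here are illustrative.

```python
import numpy as np

def bracket(x, y):
    """Matrix commutator [X, Y] = XY - YX."""
    return x @ y - y @ x

def eigenflow_rhs(a, n_mat, nu):
    """Right-hand side of dA/dt = [[N, A^T + A], A] + nu [[A^T, A], A]."""
    return bracket(bracket(n_mat, a.T + a), a) + nu * bracket(bracket(a.T, a), a)

rng = np.random.default_rng(0)
dim = 4
m = rng.standard_normal((dim, dim))
A = (m + m.T) / 2                         # symmetric initial condition
N = np.diag(np.arange(dim, dtype=float))  # N = diag(0, 1, 2, 3), illustrative

trace0 = np.trace(A)
off0 = np.abs(A - np.diag(np.diag(A))).max()

dt = 1e-3
for _ in range(30000):                    # explicit Euler up to t = 30
    A = A + dt * eigenflow_rhs(A, N, nu=1.0)

off_final = np.abs(A - np.diag(np.diag(A))).max()
print(off0, off_final)                    # off-diagonal part decays
print(abs(np.trace(A) - trace0))          # trace is invariant under the flow
```

The trace stays fixed because every term of the right-hand side is a commutator, whose trace vanishes; the off-diagonal decay is the "sorting" toward a diagonal matrix ordered consistently with N.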
Survey of Stochastic Computing
Abstract
Stochastic computing (SC) was proposed in the 1960s as a low-cost alternative to conventional binary computing. It is unique in that it represents and processes information in the form of digitized probabilities. SC employs very low-complexity arithmetic units, which was a primary design concern in the past. Despite this advantage and its inherent error tolerance, SC was seen as impractical because of very long computation times and relatively low accuracy. However, current technology trends tend to increase uncertainty in circuit behavior and imply a need to better understand, and perhaps exploit, probability in computation. This paper surveys SC from a modern perspective in which the small size, error resilience, and probabilistic features of SC may compete successfully with conventional methodologies in certain applications. First, we survey the literature and review the key concepts of stochastic number representation and circuit structure. We then describe the design of SC-based circuits and evaluate their advantages and disadvantages. Finally, we give examples of potential applications of SC and discuss some practical problems that are yet to be solved.
Parallel Computer Systems Based on Numerical Integrations
Abstract
This paper deals with continuous system simulation. The systems can be described by systems of differential equations or by block diagrams. Differential equations are usually solved by numerical methods integrated into simulation software such as Matlab, Maple, or TKSL. The Taylor series method has been used for the numerical solution of differential equations. The presented method has proved to be both very accurate and fast, and it can also be processed in parallel systems. The aim of the thesis is to design, implement, and compare several versions of the parallel system.
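The Taylor series approach can be sketched in a few lines. For the linear test equation y' = λy, each Taylor term follows from the previous one by the recurrence term_k = term_{k-1} · hλ/k, which is what makes very high orders (and hence large, accurate steps) cheap. This toy solver is an illustration under those assumptions, not the paper's TKSL implementation:

```python
import math

def taylor_step(y, h, lam, order):
    """One step of an explicit Taylor-series method for y' = lam * y.
    Successive Taylor terms are generated by the recurrence
    term_k = term_{k-1} * (h * lam) / k, so high orders cost one
    multiply-add each."""
    term = y
    result = y
    for k in range(1, order + 1):
        term *= h * lam / k
        result += term
    return result

# Illustrative run: integrate y' = -y, y(0) = 1 over [0, 1] with a
# large step h = 0.5 and order 20; the truncation error is negligible.
y, h, lam = 1.0, 0.5, -1.0
for _ in range(2):          # two steps of size 0.5 cover [0, 1]
    y = taylor_step(y, h, lam, order=20)

print(abs(y - math.exp(-1.0)))  # error near machine precision
```

Each independent term of the recurrence, and each equation of a larger system, can be evaluated concurrently, which is the property the thesis exploits in its parallel designs.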