Results 1–10 of 56
Horizons for the enactive mind: Values, social interaction, and play
, 2007
Abstract

Cited by 28 (11 self)
What is the enactive approach to cognition? Over the last 15 years this banner has grown to become a respectable alternative to traditional frameworks in cognitive science. It is at the same time a label with different interpretations and upon which different doubts have been cast. This paper elaborates on the core ideas that define the enactive approach and their implications: autonomy, sense-making, emergence, embodiment, and experience. These are coherent, radical and very powerful concepts that establish clear methodological guidelines for research. The paper also looks at the problems that arise from taking these ideas seriously. The enactive approach has plenty of room for elaboration in many different areas and many challenges to respond to. In particular, we concentrate on the problems surrounding several theories of value-appraisal and value-generation. The enactive view takes the task of understanding meaning and value very seriously and elaborates a proper scientific alternative to reductionist attempts to tackle these issues by functional localization. Another area where the enactive framework can make a significant contribution is social interaction and
Comparing mathematical provers
 In Mathematical Knowledge Management, 2nd Int’l Conf., Proceedings
, 2003
Abstract

Cited by 23 (0 self)
Abstract. We compare fifteen systems for the formalization of mathematics with the computer. We present several tables that list various properties of these programs. The three main dimensions on which we compare these systems are: the size of their library, the strength of their logic, and their level of automation.
Agency, simulation and self-identification
 Mind & Language
, 2004
Abstract

Cited by 22 (7 self)
Abstract: This paper is concerned with the problem of self-identification in the domain of action. We claim that this problem can arise not just for the self as object, but also for the self as subject in the ascription of agency. We discuss and evaluate some proposals concerning the mechanisms involved in self-identification and in agency-ascription, and their possible impairments in pathological cases. We argue in favor of a simulation hypothesis that claims that actions, whether overt or covert, are centrally simulated by the neural network, and that this simulation provides the basis for action recognition and attribution. In this paper we will be concerned with the problem of self-identification as it arises in the domain of action, which we will take to include both overt and covert or simulated actions. Talk of a problem of identification presupposes a contrast set, and the possibility that, in seeking to identify oneself, one picks out something in the contrast set instead. With self-identification, two contrast sets must be considered: the world at large and the set of other selves. The problem of self-identification therefore
Autarkic Computations in Formal Proofs
 J. Autom. Reasoning
, 1997
Abstract

Cited by 21 (1 self)
Formal proofs in mathematics and computer science are being studied because these objects can be verified by a very simple computer program. An important open problem is whether these formal proofs can be generated with an effort not much greater than writing a mathematical paper in, say, LaTeX. Modern systems for proof development make the formalization of reasoning relatively easy. Formalizing computations such that the results can be used in formal proofs is not immediate. In this paper it is shown how to obtain formal proofs of statements like Prime(61) in the context of Peano arithmetic or (x + 1)(x + 1) = x² + 2x + 1 in the context of rings. It is hoped that the method will help bridge the gap between the efficient systems of computer algebra and the reliable systems of proof development.

1. The problem

Usual mathematics is informal but precise. One speaks about informal rigor. Formal mathematics on the other hand consists of definitions, statements and proo...
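The two example statements from this abstract can be certified by verified computation in a present-day proof assistant (not one of the systems the paper itself uses); a minimal sketch in Lean 4 with Mathlib:

```lean
-- Sketch only: Lean 4 / Mathlib, a modern analogue of the paper's goal of
-- making computations usable inside formal proofs.
import Mathlib

-- `norm_num` runs a verified primality check, so the computation
-- itself becomes part of the machine-checked proof.
example : Nat.Prime 61 := by norm_num

-- Ring identities are discharged by normalizing both sides.
example (x : ℤ) : (x + 1) * (x + 1) = x ^ 2 + 2 * x + 1 := by ring
```

The point matches the paper's theme: the checker remains small and trusted while the (potentially large) computation is replayed inside it.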
Intransitivity and Vagueness
 Proceedings of the Ninth International Conference on Principles of Knowledge Representation and Reasoning (KR 2004)
Abstract

Cited by 14 (2 self)
There are many examples in the literature that suggest that indistinguishability is intransitive, despite the fact that the indistinguishability relation is typically taken to be an equivalence relation (and thus transitive). It is shown that if the uncertainty perception and the question of when an agent reports that two things are indistinguishable are both carefully modeled, the problems disappear, and indistinguishability can indeed be taken to be an equivalence relation. Moreover, this model also suggests a logic of vagueness that seems to solve many of the problems related to vagueness discussed in the philosophical literature. In particular, it is shown here how the logic can handle the sorites paradox.
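The intransitivity the paper starts from is easy to exhibit with a toy threshold model of perception (our illustration, not the paper's own model; the threshold value is hypothetical):

```python
# Model "indistinguishable" as "differs by less than a perceptual threshold".
# The resulting relation is reflexive and symmetric but NOT transitive --
# exactly the tension with treating indistinguishability as an equivalence
# relation that the paper sets out to resolve.

EPS = 1.0  # hypothetical perceptual threshold

def indistinguishable(a: float, b: float) -> bool:
    return abs(a - b) < EPS

# A sorites-style chain: each adjacent pair is indistinguishable...
assert indistinguishable(0.0, 0.6)
assert indistinguishable(0.6, 1.2)
# ...but the endpoints are distinguishable, so transitivity fails.
assert not indistinguishable(0.0, 1.2)
```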
Fuzzy rough sets: the forgotten step
 IEEE Transactions on Fuzzy Systems
, 2007
Abstract

Cited by 7 (6 self)
Abstract—Traditional rough set theory uses equivalence relations to compute lower and upper approximations of sets. The corresponding equivalence classes either coincide or are disjoint. This behaviour is lost when moving on to a fuzzy T-equivalence relation. However, none of the existing studies on fuzzy rough set theory tries to exploit the fact that an element can belong to some degree to several “soft similarity classes” at the same time. In this paper we show that taking this truly fuzzy characteristic into account may lead to new and interesting definitions of lower and upper approximations. We explore two of them in detail and we investigate under which conditions they differ from the commonly used definitions. Finally we show the possible practical relevance of the newly introduced approximations for query refinement. Index Terms—Fuzzy rough set, lower and upper approximation, query refinement, transitivity.
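For orientation, the "commonly used definitions" the abstract refers to can be sketched on a toy finite universe (our illustration with hypothetical data, not the paper's new definitions), using the min t-norm and the Kleene–Dienes implicator max(1 − a, b):

```python
# Standard fuzzy-rough lower/upper approximations of a fuzzy set A under a
# fuzzy relation R on a finite universe. Relation and set values below are
# made up for illustration.

U = ["x1", "x2", "x3"]

# Hypothetical fuzzy T-equivalence relation: symmetric, R(x, x) = 1.
R = {
    ("x1", "x1"): 1.0, ("x1", "x2"): 0.8, ("x1", "x3"): 0.2,
    ("x2", "x1"): 0.8, ("x2", "x2"): 1.0, ("x2", "x3"): 0.2,
    ("x3", "x1"): 0.2, ("x3", "x2"): 0.2, ("x3", "x3"): 1.0,
}

A = {"x1": 1.0, "x2": 0.4, "x3": 0.0}  # a fuzzy set to approximate

def lower(y: str) -> float:
    """Degree to which y certainly belongs to A (infimum of implications)."""
    return min(max(1.0 - R[(x, y)], A[x]) for x in U)

def upper(y: str) -> float:
    """Degree to which y possibly belongs to A (supremum of conjunctions)."""
    return max(min(R[(x, y)], A[x]) for x in U)

for y in U:
    print(y, lower(y), upper(y))
```

Note how x2, which is strongly related to x1, picks up a nonzero lower and upper membership; the paper's point is that these scalar infima/suprema discard the graded overlap between "soft similarity classes".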
The Mechanics of Electron-Positron Pair Creation in the 3-Spaces Model
Abstract

Cited by 6 (6 self)
Abstract: This paper lays out the mechanics of creation, in the 3-spaces model, of an electron-positron pair as a photon of energy 1.022 MeV or more is destabilized when grazing a heavy particle such as an atom's nucleus, thus converting a massless photon into two massive 0.511 MeV/c² particles charged in opposition. An alternate process was also experimentally discovered in 1997, which involves converging two tightly collimated photon beams toward a single point in space, one of the beams being made up of photons exceeding the 1.022 MeV threshold. In the latter case, electron-positron pairs were created without any atomic nuclei close by. These two observed processes of photon conversion into electron-positron pairs set the 1.022 MeV photon energy level as the threshold starting at which massless photons become highly susceptible to being destabilized into pairs of massive particles. Keywords: 3-spaces, electron-positron pair, 1.022 MeV photon, nature of mass, energy conversion to mass, materialization, sign of charges.

I. EXPERIMENTAL PROOF OF ELECTRON-POSITRON PAIR CREATION

In 1933, Blackett and Occhialini proved experimentally that cosmic radiation byproduct photons of energy 1.022 MeV or more spontaneously convert to electron-positron pairs when grazing atomic nuclei ([3]), a
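The 1.022 MeV figure quoted throughout is simply the combined rest energy of the two created particles:

```latex
E_{\text{threshold}} \;=\; 2\, m_e c^2 \;=\; 2 \times 0.511\ \text{MeV} \;=\; 1.022\ \text{MeV}
```

A photon below this energy cannot produce the pair regardless of the mechanism, since the rest masses alone already account for the full threshold.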
On Galilean and Lorentz invariance in pilot-wave dynamics
, 2008
Abstract

Cited by 4 (2 self)
It is argued that the natural kinematics of the pilot-wave theory is Aristotelian. Despite appearances, Galilean invariance is not a fundamental symmetry of the low-energy theory. Instead, it is a fictitious symmetry that has been artificially imposed. It is concluded that the search for a Lorentz-invariant
A Mechanical Classical Laboratory Situation With A Quantum Logic Structure
 Phys. Acta
, 1992
Abstract

Cited by 3 (3 self)
this paper we will make a similar attempt but now concentrating on the logical aspects of the example.

1. CLASSICAL ENTITIES AND QUANTUM ENTITIES.

In the discipline of quantum logic the basic structure of research was originally the structure of orthomodular lattice. Quantum-like entities had such a structure of orthomodular lattice for the set of their propositions, while classical entities had a structure of distributive orthomodular lattice (or Boolean algebra) for the set of their propositions. Meanwhile the classification of classical entities versus quantum-like (or non-classical) entities has been studied in much more detail, relating the corresponding structures to real physical situations [2]. We have taken a very easy criterion from the results of this research on the possibility of distinguishing between the two kinds of entities, classical and non-classical. We shall consider an entity (classical or non-classical) to be described by a set M (denoted by m, n, ...) of measurements and a set (denoted by p, q, ...) of states. In different approaches different names have been given to these two basic sets (measurements have been called yes-no experiments, questions, propositions, observables and operations, and states have also been called preparations), but mainly these differences will play no role in what we would like to show in this paper. The results can easily be translated into the proper approach. We must remark that when we use the concept 'state', we mean 'pure state'. The more general situation of mixed states will not be considered here, since our example in any case does not contain mixed states. The following characterization of a classical entity shall be adopted (it is the characterization explicitly used in [3]): Criterion for classical cha...
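The distributivity criterion separating the two cases can be made concrete with a standard textbook example (ours, not drawn from this paper): in the lattice of closed subspaces of a two-dimensional Hilbert space H, take three pairwise distinct one-dimensional subspaces a, b, c. Then

```latex
a \wedge (b \vee c) \;=\; a \wedge H \;=\; a,
\qquad\text{while}\qquad
(a \wedge b) \vee (a \wedge c) \;=\; 0 \vee 0 \;=\; 0,
```

so the distributive law fails in the quantum-like (orthomodular) case, whereas in a Boolean algebra the two sides always coincide.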
On Representing and Generating Kernels by Fuzzy Equivalence Relations
Abstract

Cited by 2 (0 self)
Kernels are two-placed functions that can be interpreted as inner products in some Hilbert space. It is this property which makes kernels predestined to carry linear models of learning, optimization or classification strategies over to nonlinear variants. Following this idea, various kernel-based methods like support vector machines or kernel principal component analysis have been conceived, which prove to be successful for machine learning, data mining and computer vision applications. When applying a kernel-based method, a central question is the choice and the design of the kernel function. This paper provides a novel view on kernels based on fuzzy-logical concepts which allows one to incorporate prior knowledge in the design process. It is demonstrated that kernels mapping to the unit interval with constant one on the diagonal can be represented by a commonly used fuzzy-logical formula for representing fuzzy rule bases. This means that a large class of kernels can be represented by fuzzy-logical concepts. Apart from this result, which only guarantees the existence of such a representation, constructive examples are presented and the relation to unlabeled learning is pointed out.
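A concrete instance of the kind of kernel the abstract describes (our illustration, not the paper's construction) is the Laplacian kernel k(x, y) = exp(−|x − y|): it maps to (0, 1], equals 1 on the diagonal, and is a fuzzy equivalence relation with respect to the product t-norm T(a, b) = a·b, since exp(−|x−y|)·exp(−|y−z|) = exp(−(|x−y| + |y−z|)) ≤ exp(−|x−z|) by the triangle inequality:

```python
# Check that the Laplacian kernel behaves as a fuzzy T-equivalence relation
# (reflexive, symmetric, T-transitive for the product t-norm) on a sample.
import itertools
import math

def k(x: float, y: float) -> float:
    return math.exp(-abs(x - y))

points = [0.0, 0.3, 1.5, 2.0, 4.2]

for x, y, z in itertools.product(points, repeat=3):
    assert k(x, x) == 1.0                       # reflexivity: 1 on the diagonal
    assert k(x, y) == k(y, x)                   # symmetry
    assert k(x, y) * k(y, z) <= k(x, z) + 1e-12  # product-t-norm transitivity

print("fuzzy T-equivalence checks passed")
```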