Results 1–10 of 52
The Small Scale Structure of Space-Time: A Bibliographical Review
, 1995
Cited by 21 (0 self)
This essay is a tour around many of the lesser-known pregeometric models of physics, as well as the mainstream approaches to quantum gravity, in search of common themes which may provide a glimpse of the final theory which must lie behind them.
What is good mathematics?
, 2007
Cited by 9 (0 self)
Abstract. Some personal thoughts and opinions on what “good quality mathematics” is, and whether one should try to define this term rigorously. As a case study, the story of Szemerédi’s theorem is presented.

1. The many aspects of mathematical quality

We all agree that mathematicians should strive to produce good mathematics. But how does one define “good mathematics”, and should one even dare to try at all? Let us first consider the former question. Almost immediately one realises that there are many different types of mathematics which could be designated “good”. For instance, “good mathematics” could refer (in no particular order) to:

(i) Good mathematical problem-solving (e.g. a major breakthrough on an important mathematical problem);
(ii) Good mathematical technique (e.g. a masterful use of existing methods, or the development of new tools);
(iii) Good mathematical theory (e.g. a conceptual framework or choice of notation which systematically unifies and generalises an existing body of results);
Bayesian inference, and the geometry of the space of probability distributions
 in Advances in Minimum Description Length: Theory and Applications, P. Grünwald, I. J. Myung, and M. Pitt, Eds., pp. 81–98, The MIT Press
, 2006
Cited by 8 (0 self)
The Minimum Description Length (MDL) approach to parametric model selection chooses a model that provides the shortest codelength for data, while the Bayesian approach selects the model that yields the highest likelihood for the data. In this article I describe how the Bayesian approach yields essentially the same model selection criterion as MDL, provided one chooses a Jeffreys prior for the parameters. Both MDL and Bayesian methods penalize complex models until a sufficient amount of data has justified their selection. I show how these complexity penalties can be understood in terms of the geometry of parametric model families seen as surfaces embedded in the space of distributions. I arrive at this understanding by asking how many different, or distinguishable, distributions are contained in a parametric model family. By answering this question, I find that the Jeffreys prior of Bayesian methods measures the density of distinguishable distributions contained in a parametric model family in a reparametrization-independent way. This leads to a picture where the complexity of a model family is related to the fraction of its volume in the space of distributions that lies close to the truth.
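The MDL/Bayesian equivalence this abstract describes can be illustrated with a small numerical sketch. The snippet below is my own illustration, not code from the paper: it scores polynomial models by the asymptotic two-part codelength −log L + (k/2) log n, which is the leading-order penalty shared by MDL and the Bayesian criterion under a Jeffreys prior (the constant Fisher-volume term is omitted here, since it does not grow with n). The synthetic data and model family are assumptions chosen for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a quadratic model with Gaussian noise.
x = np.linspace(-1, 1, 200)
y = 1.0 - 2.0 * x + 3.0 * x**2 + rng.normal(scale=0.3, size=x.size)

def two_part_codelength(x, y, degree):
    """Asymptotic two-part codelength: -log-likelihood + (k/2) log n.

    This is the leading-order term shared by MDL and the Bayesian
    marginal likelihood with a Jeffreys prior; the O(1) Fisher-volume
    term is dropped because it does not grow with n.
    """
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n = x.size
    sigma2 = np.mean(resid**2)                       # ML noise estimate
    neg_log_lik = 0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = degree + 1                                   # free parameters
    return neg_log_lik + 0.5 * k * np.log(n)

scores = {d: two_part_codelength(x, y, d) for d in range(6)}
best = min(scores, key=scores.get)
```

On this data the (k/2) log n penalty should steer the criterion away from underfit degrees 0 and 1 toward the true quadratic, while rejecting higher-degree models whose extra parameters buy only a marginal improvement in fit.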
Gravity as an emergent phenomenon: A conceptual description
 AIP Conference Proceedings 989, 114 (2007) [arXiv:0706.1654]
Cited by 7 (1 self)
Abstract. I describe several broad features of a programme to understand gravity as an emergent, long-wavelength phenomenon (like elasticity), and discuss one concrete framework for realizing this paradigm against the backdrop of several recent results.
Neocybernetics in biological systems
, 2006
Cited by 4 (3 self)
This report summarizes ten levels of abstraction that together span the continuum from the most elementary to the most general levels when modeling biological systems. It is shown how the neocybernetic principles can be seen as the key to reaching a holistic view of complex processes in general.

Preface

Concrete examples help to understand complex systems. In this report, the key point is to illustrate the basic mechanisms and properties of neocybernetic system models. Good visualizations are certainly needed. It is biological systems, or living systems, that are perhaps the most characteristic examples of cybernetic systems. This intuition is extended here to natural systems in general — indeed, it is all systems other than man-made ones that seem to be cybernetic. The word “biological” in the title should be interpreted as “bio-logical”, referring to general studies of any living systems, independent of the phenosphere. Starting from the concrete examples, connections to more abstract systems are found, and the discussions become more and more all-embracing in this text. However, the neocybernetic model framework still makes it possible to conceptually master the complexity. There is more information about neocybernetics available on the Internet; this report is also available there in electronic form:
The Development of Models of Computation with Advances in Technology and Natural Sciences
Cited by 2 (2 self)
Abstract. The development of models of computation induces the development of technology and natural sciences, and vice versa. The current state of the art of technology and the sciences, especially networks of concurrent processes such as the Internet or biological and sociological systems, calls for new computational models. It is necessary to extend the classical Turing machine model towards physical/natural computation. Important aspects are the openness and interactivity of computational systems, as well as the concurrency of computational processes. The development proceeds in two directions: as a search for new mathematical structures beyond algorithms, and as a search for different modes of physical computation that are not equivalent to the actions of a human executing an algorithm, but appear in physical systems in which concurrent interactive information processing takes place. The article presents the framework of info-computationalism as applied to computing nature, where nature is an informational structure and its dynamics (information processing) is understood as computation. In natural computing, new developments are needed both in the understanding of natural systems and in their computational modelling; the two converge and enhance each other.
Constructive Verification, Empirical Induction, and Fallibilist Deduction: A Threefold Contrast
, 2011
Info‐Computational Philosophy Of Nature: An Informational Universe With Computational Dynamics
, 2011
Cited by 2 (0 self)
Starting with Søren Brier’s cybersemiotic critique of the existing practice of Wissenschaft, this article develops the argument for an alternative naturalization of knowledge production. It presents the framework of natural info‐computationalism, ICON, as a new Natural Philosophy based on the concepts of information (structure) and computation (process). In this approach, which is a synthesis of informationalism (the view that nature is informational) and computationalism (the view that nature computes its own time development), computation is in general not a substrate‐independent, disembodied symbol manipulation. Based on the informational character of nature, where matter and informational structure are equivalent, information processing is in general embodied and substrate-specific. The Turing Machine model of abstract discrete sequential symbol manipulation is a subset of the Natural computing model. With this generalized idea of Natural computing and Informational Structural Realism, info-computationalism (ICON), adopting the scientific third-person account, covers the entire list of requirements for a naturalist knowledge-production framework from Brier (2010), except for qualia as experienced in a first-person mode.
Name Strategy: Its Existence and Implications
, 2005
Cited by 2 (2 self)
It is argued that colour name strategy, object name strategy, and chunking strategy in memory are all aspects of the same general phenomenon, called stereotyping, and this in turn is an example of a know-how representation. Such representations are argued to have their origin in a principle called the minimum duplication of resources. For most of the subsequent discussion, the existence of a colour name strategy suffices. It is pointed out that the Berlin–Kay universal partial ordering of colours and the frequency of traffic accidents classified by colour are surprisingly similar; a detailed analysis is not carried out, as the specific colours recorded are not identical. Some consequences of the existence of a name strategy for the philosophy of language and mathematics are discussed: specifically, it is argued that in accounts of truth and meaning it is necessary throughout to use real numbers as opposed to bivalent quantities, and also that the concomitant label associated with sentences should not be one of unconditional truth, but rather several real-valued quantities associated with visual communication. The implication of real-valued truth quantities is that the continuum hypothesis of pure mathematics is sidestepped, because real-valued quantities occur ab initio. The existence of a name strategy shows that thought/sememes and talk/phonemes can be separate, and this vindicates the assumption of thought occurring before talk used in psycholinguistic speech-production models.

Copyright © 2003–2005 Yang’s Scientific Research Institute, LLC. All rights reserved.

Index Terms: Radical interpretation.
Virtual Science: Virtuality and Knowledge Acquisition in . . .
 in Virtual Reality: Cognitive Foundations, Technological Issues & Philosophical Implications. Frankfurt am Main: Peter Lang
, 2001