Results 1–10 of 34
The power of paradox: some recent developments in interactive epistemology
International Journal of Game Theory, 2007
Cited by 35 (2 self)
Paradoxes of game-theoretic reasoning have played an important role in spurring developments in interactive epistemology, the area in game theory that studies the role of the players' beliefs, knowledge, etc. This paper describes two such paradoxes: one concerning backward induction, the other iterated weak dominance. We start with the basic epistemic condition of “rationality and common belief of rationality” in a game, describe various ‘refinements’ of this condition that have been proposed, and explain how these refinements resolve the two paradoxes. We will see that a unified epistemic picture of game theory emerges. We end with some new foundational questions uncovered by the epistemic program.
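The backward-induction paradox mentioned in this abstract can be made concrete with a small worked example. The sketch below is purely illustrative and not from the paper itself (the game tree, payoffs, and function names are invented): it solves a tiny centipede-style game by backward induction, where the procedure tells the first player to end the game immediately even though both players would prefer a later outcome.

```python
# A minimal backward-induction sketch on a hypothetical two-player
# extensive-form game. All names and payoffs here are invented for
# illustration; they do not come from the paper above.

def backward_induction(node):
    """Return (payoffs, action_path) chosen by optimal play from `node`."""
    if "payoffs" in node:                       # leaf: outcome of the game
        return node["payoffs"], []
    player = node["player"]                     # whose turn it is here
    best = None
    for action, child in node["children"].items():
        payoffs, path = backward_induction(child)
        # The mover keeps the action maximizing their own payoff,
        # given that all later play is already solved optimally.
        if best is None or payoffs[player] > best[0][player]:
            best = (payoffs, [action] + path)
    return best

# A tiny centipede-style game: player 0 moves first, then player 1.
game = {
    "player": 0,
    "children": {
        "take": {"payoffs": (1, 0)},
        "pass": {
            "player": 1,
            "children": {
                "take": {"payoffs": (0, 2)},
                "pass": {"payoffs": (3, 1)},
            },
        },
    },
}

payoffs, path = backward_induction(game)
print(payoffs, path)   # (1, 0) ['take']
```

Here backward induction selects an immediate "take" with payoffs (1, 0), although both players would do better at the (3, 1) leaf; this gap between the induced solution and intuitively reasonable play is the kind of paradox the paper examines.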
A Case Study in Real-Time Parallel Computation: Correcting Algorithms
Journal of Parallel and Distributed Computing, 2001
Cited by 21 (19 self)
A correcting algorithm is one that receives an endless stream of corrections to its initial input data and terminates when all the corrections received have been taken into account. We give a characterization of correcting algorithms based on the theory of data-accumulating algorithms. In particular, it is shown that any correcting algorithm exhibits superunitary behavior in a parallel computation setting if and only if the static counterpart of that correcting algorithm manifests a strictly superunitary speedup. Since both classes of correcting and data-accumulating algorithms are included in the more general class of real-time algorithms, we show in fact that many problems from this class manifest superunitary behavior. Moreover, we give an example of a real-time parallel computation that pertains to neither of the two classes studied (namely, correcting and data-accumulating algorithms), but still manifests superunitary behavior. Because of the aforementioned results, the usual measures of performance for parallel algorithms (that is, speedup and efficiency) lose much of their ability to convey effectively the nature of the phenomenon taking place in the real-time case. We therefore propose a more expressive measure that captures all the relevant parameters of the computation. Our proposal is made in terms of a graphical representation. We state as an open problem the investigation of such a measure, including finding an analytical form for it.
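For reference, the conventional measures this abstract argues become inadequate are speedup S(p) = T1 / Tp and efficiency E(p) = S(p) / p, where T1 is serial time and Tp is parallel time on p processors; "superunitary" behavior means S(p) > p, i.e. E(p) > 1. A minimal sketch with invented timings (the numbers are not from the paper):

```python
# Standard parallel performance measures. The timing values below are
# invented purely to illustrate the definitions.

def speedup(t_serial, t_parallel):
    """S(p) = T1 / Tp."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    """E(p) = S(p) / p; E(p) <= 1 in the conventional static setting."""
    return speedup(t_serial, t_parallel) / p

# Conventional setting: 4 processors yield at most ~4x speedup.
print(speedup(100.0, 25.0))           # 4.0
print(efficiency(100.0, 25.0, 4))     # 1.0

# "Superunitary" behavior: speedup exceeding the processor count,
# which the abstract argues real-time/correcting computations exhibit.
print(speedup(100.0, 10.0) > 4)       # True: speedup 10 on 4 processors
```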
From Heisenberg to Gödel via Chaitin
2008
Cited by 11 (9 self)
In 1927 Heisenberg discovered that the “more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa”. Four years later Gödel showed that a finitely specified, consistent formal system which is large enough to include arithmetic is incomplete. As both results express some kind of impossibility, it is natural to ask whether there is any relation between them, and indeed this question has been asked repeatedly for a long time. The main interest seems to have been in possible implications of incompleteness for physics. In this note we take interest in the converse implication and offer a positive answer to the question: does uncertainty imply incompleteness? We show that algorithmic randomness is equivalent to a “formal uncertainty principle” which implies Chaitin's information-theoretic incompleteness. We also show that the derived uncertainty relation is, for many computers, physical. This fact supports the conjecture that uncertainty implies randomness not only in mathematics, but also in physics.
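The algorithmic randomness invoked here is a matter of incompressibility: a string is random when no program much shorter than the string itself prints it. As a loose, purely illustrative stand-in (this is not the paper's construction, and compression length is only a computable upper bound on description length), one can watch a patterned string compress while a random one does not:

```python
import os
import zlib

def c_len(data: bytes) -> int:
    """zlib-compressed length: a computable upper bound on the
    description length of `data`, a crude proxy for its Kolmogorov
    complexity (which itself is uncomputable)."""
    return len(zlib.compress(data))

regular = b"ab" * 500          # highly patterned: a short program prints it
random_ = os.urandom(1000)     # algorithmically random with high probability

print(c_len(regular) < 100)    # True: 1000 bytes shrink dramatically
print(c_len(random_) > 900)    # True (with overwhelming probability)
```

The second print can in principle be False for a lucky draw of `os.urandom`, but almost all 1000-byte strings are incompressible; that asymmetry is exactly what Chaitin's incompleteness theorem exploits.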
Complexity, Deconstruction, and Relativism
Theory, Culture & Society, 2005
Cited by 11 (2 self)
Is Complexity a Source of Incompleteness?
2004
Mathematical proofs at a crossroad
Theory Is Forever, Lecture Notes in Computer Science 3113, 2004
Cited by 7 (7 self)
For more than 2000 years, from Pythagoras and Euclid to Hilbert and Bourbaki, mathematical proofs were essentially based on axiomatic-deductive reasoning. In the last decades, the increasing length and complexity of many mathematical proofs have led to the expansion of some empirical, experimental, psychological and social aspects, yesterday only marginal, but now changing radically the very essence of proof. In this paper we try to organize this evolution, to distinguish its different steps and aspects, and to evaluate its advantages and shortcomings. Axiomatic-deductive proofs are not a posteriori work, a luxury we can marginalize, nor are computer-assisted proofs bad mathematics. There is hope for integration!
Entropic Principles
2008
Cited by 2 (0 self)
We discuss the evolution of radiation and Bekenstein-Hawking entropies in expanding isotropic universes. We establish a general relation which shows why it is inevitable that there is currently a huge difference in the numerical values of these two entropies. Some anthropic constraints on their values are given and other aspects of the cosmological ‘entropy gap’ problem are discussed. The coincidence of the classical and quantum entropies for black holes with Hawking lifetime equal to the age of the universe, and hence of radius equal to the proton size, is shown to be identical to the condition that we observe the universe at the main-sequence lifetime.
Realism for scientific ontologies
Science aims to develop an accurate understanding of reality through a variety of rigorously empirical and formal methods. Ontologies are used to formalize the meaning of terms within a domain of discourse. The Basic Formal Ontology (BFO) is an ontology of particular importance in the biomedical domains, where it provides the top level for numerous ontologies, including those admitted as part of the OBO Foundry collection. The BFO requires that all classes in an ontology are actually instantiated in reality. Beyond the fact that it is hard to show whether entities of some kind exist or do not exist in reality (especially for unobservable entities like elementary particles), this criterion fails to satisfy the need of scientists to communicate their findings and theories unambiguously. We discuss the problems that arise from the BFO's realism criterion and suggest viable alternatives.
Identity Expansion and Transcendence
Emerging developments in communications and computing technology may transform the nature of human identity, in the process rendering obsolete the traditional philosophical and scientific frameworks for understanding the nature of individuals and groups. Progress toward an evaluation of this possibility, and an appropriate conceptual basis for analyzing it, may be derived from two very different but ultimately connected social movements that promote this radical change. One is the governmentally supported exploration of Converging Technologies, based in the unification of nanoscience, biology, information science and cognitive science (NBIC). The other is the Transhumanist movement, which has been criticized as excessively radical yet is primarily conducted as a dignified intellectual discussion within a new school of philosophy about human enhancement. Together, NBIC and Transhumanism suggest the immense transformative power of today's technologies, through which individuals may explore multiple identities by means of online avatars, semi-autonomous intelligent agents, and other identity expansions.