Results 1-10 of 14
ON INTERPRETING CHAITIN’S INCOMPLETENESS THEOREM
, 1998
Abstract

Cited by 9 (0 self)
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin’s famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
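The coding-dependence the abstract emphasizes can be illustrated with a computable stand-in for Kolmogorov complexity: compressed size under one fixed compressor. This is only a sketch under that assumption (zlib is my choice of "coding", not the paper's; true Kolmogorov complexity is uncomputable and compression gives only an upper bound):

```python
import zlib

def compressed_size(s: bytes) -> int:
    """Upper bound on the Kolmogorov complexity of s, in bytes, via a
    fixed compressor (zlib at maximum effort). The numbers it yields
    depend entirely on the chosen compressor, mirroring the paper's
    point that the theorem's constants depend on the chosen coding
    of Turing machines."""
    return len(zlib.compress(s, 9))

# A highly regular 1000-byte string has a short description...
regular = b"01" * 500
# ...while a less patterned string of similar length does not:
mixed = bytes(range(256)) * 4

# regular compresses to far fewer bytes than mixed
print(compressed_size(regular), compressed_size(mixed))
```

Swapping zlib for any other compressor changes every number printed here, without changing anything about the strings themselves.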
Computers, Reasoning and Mathematical Practice
Abstract

Cited by 6 (2 self)
Abstraction in itself is not the goal: for Whitehead [117] "it is the large generalisation, limited by a happy particularity, which is the fruitful conception." As an example consider the theorem in ring theory which states that if R is a ring, f(x) is a polynomial over R and f(r) = 0 for every element r of R, then R is commutative. Special cases of this, for example f(x) = x^2 - x or x^3 - x, can be given a first-order proof in a few lines of symbol manipulation. The usual proof of the general result [20] (which takes a semester's postgraduate course to develop from scratch) is a corollary of other results: we prove that rings satisfying the condition are semisimple artinian, apply a theorem which shows that all such rings are matrix rings over division rings, and eventually obtain the result by showing that all finite division rings are fields, and hence commutative. This displays von Neumann's architectural qualities: it is "deep" in a way in which the symbol manipulati...
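For the special case f(x) = x^2 - x, the "few lines of symbol manipulation" can be spelled out (a standard derivation, not quoted from the paper): assume r^2 = r for all r in R, then

```latex
\begin{align*}
2a = a + a &= (a+a)^2 = a^2 + a^2 + a^2 + a^2 = 4a
  &&\Rightarrow\ 2a = 0,\ \text{so } -a = a \text{ for all } a \in R,\\
a + b &= (a+b)^2 = a^2 + ab + ba + b^2 = a + ab + ba + b
  &&\Rightarrow\ ab + ba = 0\ \Rightarrow\ ab = -ba = ba.
\end{align*}
```

So every ring of idempotents (a Boolean ring) is commutative, using nothing beyond expanding squares.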
The Political Entropy of Vote Choice: An Empirical Test of Uncertainty Reduction
 Presented at the 1997 Annual Meeting of the American Political Science Association, August 27–31
, 1997
Abstract

Cited by 1 (0 self)
Recent literature in voting theory has developed the idea that individual voting preferences are probabilistic rather than strictly deterministic. This work builds upon spatial voting models (Enelow and Hinich 1981, Ferejohn and Fiorina 1974, Davis, DeGroot and Hinich 1972, Farquharson 1969) by introducing probabilistic uncertainty into the calculus of the voting decision at the individual level. Some suggest that the voting decision can be modeled with traditional probabilistic tools of uncertainty (Coughlin 1990, Coughlin and Nitzan 1981). Entropy is a measure of uncertainty that originated in statistical thermodynamics. Essentially, entropy indicates the amount of uncertainty in probability distributions (Soofi 1992), or it can be thought of as signifying a lack of human knowledge about some random event (Denbigh and Denbigh, 1985). Entropy in statistics developed with Kolmogorov (1959), Khinchin (1957), and Shannon (1948), but has rarely been applied to social science problems. Exception...
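The Shannon entropy the abstract invokes is a one-line formula over a probability distribution. A minimal sketch (the function name is mine, not the paper's):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits.
    Zero-probability outcomes contribute nothing (0 * log 0 := 0)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A voter certain of their choice carries no uncertainty:
print(shannon_entropy([1.0, 0.0]))   # 0.0
# Two equally likely options are maximally uncertain:
print(shannon_entropy([0.5, 0.5]))   # 1.0
# Four equally likely options:
print(shannon_entropy([0.25] * 4))   # 2.0
```

Entropy is maximal for a uniform distribution and falls to zero as probability mass concentrates on one outcome, which is what makes it usable as an uncertainty index for vote choice.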
KLTOOL: A Mathematical Tool for Analyzing Spatiotemporal Data
, 1992
Abstract

Cited by 1 (0 self)
1. Introduction 2. Spatiotemporal Data 3. Dynamical Systems Concepts 4. Karhunen-Loève Decomposition 5. Overview of kltool 6. Examples 7. Future Directions 8. Summary Bibliography Appendix: Galerkin Projection for the Kuramoto-Sivashinsky PDE 1. Introduction The quantitative analysis of low-dimensional chaotic dynamical systems has been an active area of research for many years. Up until now, most work has concentrated on the analysis of time series data from laboratory experiments and numerical simulations. Examples include Rayleigh-Bénard convection, Couette-Taylor fluid flow, and the Belousov-Zhabotinskii chemical reaction [Libchaber, Fauve & Laroche '83], [Roux '83] and [Swinney '84]. The key idea is to reconstruct a representation of the underlying attractor from the time series. (The time-delay embedding method [Takens '81] is one popular approach.) Given the reconstructed attractor, it is possible to estimate various properties of the dynamics: Lyapunov e...
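The time-delay reconstruction mentioned above can be sketched in a few lines (a minimal illustration of the Takens-style embedding; the function name and parameters are mine, not kltool's API):

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Reconstruct a dim-dimensional attractor from a scalar time series:
    the i-th embedded point is (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    n = len(series) - (dim - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(dim)])

t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)                          # scalar observable of the system
points = delay_embed(x, dim=2, tau=25)
print(points.shape)                    # (1975, 2)
```

For a periodic signal the embedded points trace a closed loop; for a chaotic signal they fill out a fractal attractor, whose properties (dimension, Lyapunov exponents) can then be estimated.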
Research Agenda for Integrated Landscape Modeling
, 2007
Abstract

Cited by 1 (0 self)
Reliable predictions of how changing climate and disturbance regimes will affect forest ecosystems are crucial for effective forest management. Current fire and climate research in forest ecosystem and community ecology offers data and methods that can inform such predictions. However, research in these fields occurs at different scales, with disparate goals, methods, and contexts. Often results are not readily comparable among studies and defy integration. We discuss the strengths and weaknesses of three modeling paradigms: empirical gradient models, mechanistic ecosystem models, and stochastic landscape disturbance models. We then propose a synthetic approach to multiscale analysis of the effects of climatic change and disturbance on forest ecosystems. Empirical gradient models provide an anchor and spatial template for stand-level forest ecosystem models by quantifying key parameters for individual species and accounting for broad-scale geographic variation among them. Gradient imputation transfers predictions of fine-scale forest composition and structure across geographic space. Mechanistic ecosystem dynamic models predict the responses of biological variables to specific environmental drivers and facilitate understanding of temporal dynamics and disequilibrium. Stochastic landscape dynamics models predict frequency, extent, and severity of broad-scale disturbance. A robust linkage of these three modeling paradigms will facilitate prediction of the effects of altered fire and other disturbance regimes on forest ecosystems at multiple scales and in the context of climatic variability and change.
LIMITED COMPLEX SYSTEMS IN CRISIS: THE DEVELOPMENT PROCESS UNDER CONDITIONS OF URGENT STRESS
, 1995
Chaos and fractals in Otolaryngology
Abstract
Introduction Aeons of scientific thought have been dominated by the Newtonian philosophy that physical systems in the cosmos are predictable. Given exact knowledge of a system's initial conditions and the physical laws that govern it, it is possible to predict its long-term behaviour. However, there are systems whose nature defies any practical attempt to predict their behaviour. One such example is the weather, which, even with the latest technology, is difficult to forecast accurately beyond two or three days. What is responsible for this unpredictability? Is it a mere lack of adequate models and data, or is it an intrinsic property of the system? This dichotomy and its causes are exactly what chaos theory attempts to investigate and illuminate. The concept of chaos (or deterministic chaos) emerged between the early 1960s and the early 1970s in theoretical and applied mathematics, and it is now embedded in the theory of nonlinear dynamics
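The intrinsic unpredictability described above, sensitive dependence on initial conditions, can be demonstrated with the logistic map, a standard textbook example (my illustration, not taken from the article):

```python
def logistic(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    return r * x * (1.0 - x)

# Two initial conditions differing by one part in ten billion.
x, y = 0.2, 0.2 + 1e-10
gap = []
for _ in range(50):
    x, y = logistic(x), logistic(y)
    gap.append(abs(x - y))

# The tiny initial discrepancy is amplified roughly exponentially,
# so the two trajectories soon bear no resemblance to each other.
print(f"initial gap 1e-10, largest gap after 50 steps: {max(gap):.3f}")
```

Even with the governing law known exactly, any measurement error in the initial state eventually dominates, which is the practical obstacle to long-range weather forecasting.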
Path Dependence, its critics . . .
, 2000
Abstract
The concept of path dependence refers to a property of contingent, nonreversible dynamical processes, including a wide array of biological and social processes that can properly be described as `evolutionary'. To dispel existing confusions in the literature, and clarify the meaning and significance of path dependence for economists, the paper formulates definitions that relate the phenomenon to the property of nonergodicity in stochastic processes; it examines the nature of the relationship between path dependence and `market failure', and discusses the meaning of `lock-in'. Unlike tests for the presence of nonergodicity, assessments of the economic significance of path dependence are shown to involve difficult issues of counterfactual specification, and the welfare evaluation of alternative dynamic paths rather than terminal states. The policy implications of the existence of path dependence are shown to be more subtle and, as a rule, quite different from those which have been presumed by critics of the concept. A concluding section applies the notion of `lock-in' reflexively to the evolution of economic analysis, suggesting that resistance to historical economics is a manifestation of `sunk cost hysteresis' in the sphere of human cognitive development.
DYNAMICAL SYSTEMS: REGULARITY AND CHAOS
Abstract
In very general terms, we call DYNAMICAL any kind of “system” which evolves in time, starting from an initial time t0, and whose state at any later time t > t0 can be explicitly and uniquely determined from the assumed knowledge of its initial state at t = t0. One of the major goals of the theory of dynamical systems is to understand how
Construction of Predictability: The overlooked fixed costs
, 2004
Abstract
Technological capabilities, invisible infrastructure and the unsocial construction of predictability: the overlooked fixed costs of useful research