Results 1 - 10 of 37
A Theory of Program Size Formally Identical to Information Theory
, 1975
Abstract

Cited by 333 (16 self)
A new definition of program-size complexity is made. H(A,B/C,D) is defined to be the size in bits of the shortest self-delimiting program for calculating strings A and B if one is given a minimal-size self-delimiting program for calculating strings C and D. This differs from previous definitions: (1) programs are required to be self-delimiting, i.e. no program is a prefix of another, and (2) instead of being given C and D directly, one is given a program for calculating them that is minimal in size. Unlike previous definitions, this one has precisely the formal properties of the entropy concept of information theory. For example, H(A,B) = H(A) + H(B/A) + O(1). Also, if a program of length k is assigned measure 2^{-k}, then H(A) = -log_2 (the probability that the standard universal computer will calculate A) + O(1). Key Words and Phrases: computational complexity, entropy, information theory, instantaneous code, Kraft inequality, minimal program, probab...
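The abstract's two formal ingredients, self-delimiting (prefix-free) programs and the 2^{-k} measure, can be illustrated with a short sketch; this is not from the paper, and the codewords are invented for illustration:

```python
from math import log2

def is_prefix_free(codes):
    """True if no codeword is a proper prefix of another (self-delimiting)."""
    return not any(a != b and b.startswith(a) for a in codes for b in codes)

def kraft_sum(codes):
    """Sum of 2^(-len(c)); the Kraft inequality bounds this by 1 for prefix-free codes."""
    return sum(2.0 ** -len(c) for c in codes)

codes = ["0", "10", "110", "111"]      # a complete prefix-free code
print(is_prefix_free(codes))           # True
print(kraft_sum(codes))                # 1.0 (the inequality is tight here)
# A program of length k is assigned measure 2^-k, so -log2(measure) recovers k:
print(-log2(2.0 ** -len("110")))       # 3.0
```

The completeness of this particular code (Kraft sum exactly 1) mirrors the abstract's identification of -log_2(probability) with program length up to O(1).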
Making sense of randomness: Implicit encoding as a basis for judgment
 Psychological Review
, 1997
Abstract

Cited by 35 (0 self)
People attempting to generate random sequences usually produce more alternations than expected by chance. They also judge over-alternating sequences as maximally random. In this article, the authors review findings, implications, and explanatory mechanisms concerning subjective randomness. The authors next present the general approach of the mathematical theory of complexity, which identifies the length of the shortest program for reproducing a sequence with its degree of randomness. They describe three experiments, based on mean group responses, indicating that the perceived randomness of a sequence is better predicted by various measures of its encoding difficulty than by its objective randomness. These results seem to imply that, in accordance with the complexity view, judging the extent of a sequence's randomness is based on an attempt to mentally encode it. The experience of randomness may result when this attempt fails. Judging a situation as more or less random is often the key to important cognitions and behaviors. Perceiving a situation as nonchance calls for explanations, and it marks the onset of inductive inference (Lopes, 1982). Lawful environments encourage a coping orientation. One may try to control a situation
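The alternation bias the authors describe is easy to quantify; a minimal sketch (the function name and example sequence are mine, not the paper's):

```python
def alternations(seq):
    """Count adjacent positions where the symbol changes."""
    return sum(a != b for a, b in zip(seq, seq[1:]))

# In a truly random binary sequence each adjacent pair alternates with
# probability 1/2, so the chance expectation is (len(seq) - 1) / 2.
seq = "HTHTHTHTHT"        # over-alternating, yet often judged highly random
print(alternations(seq))  # 9, far above the chance expectation of 4.5
```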
System Identification, Approximation and Complexity
 International Journal of General Systems
, 1977
Abstract

Cited by 34 (23 self)
This paper is concerned with establishing broadly-based system-theoretic foundations and practical techniques for the problem of system identification that are rigorous, intuitively clear and conceptually powerful. A general formulation is first given in which two order relations are postulated on a class of models: a constant one of complexity; and a variable one of approximation induced by an observed behaviour. An admissible model is such that any less complex model is a worse approximation. The general problem of identification is that of finding the admissible subspace of models induced by a given behaviour. It is proved under very general assumptions that, if deterministic models are required, then nearly all behaviours require models of nearly maximum complexity. A general theory of approximation between models and behaviour is then developed, based on subjective probability concepts and semantic information theory. The role of structural constraints such as causality, locality, finite memory, etc., is then discussed as rules of the game. These concepts and results are applied to the specific problem of stochastic automaton, or grammar, inference. Computational results are given to demonstrate that the theory is complete and fully operational. Finally, the formulation of identification proposed in this paper is analysed in terms of Klir’s epistemological hierarchy, and both are discussed in terms of the rich philosophical literature on the acquisition of knowledge.
Interval-Valued Probabilities
, 1998
Abstract

Cited by 21 (1 self)
a′/h′ in the diagram. The sawtooth line reflects the fact that even when the principle of indifference can be applied, there may be arguments whose strength can be bounded no more precisely than by an adjacent pair of indifference arguments. Note that a/h in the diagram is bounded numerically only by 0.0 and the strength of a″/h″. Keynes' ideas were taken up by B. O. Koopman [14, 15, 16], who provided an axiomatization for Keynes' probability values. The axioms are qualitative, and reflect what Keynes said about probability judgment. (It should be remembered that for Keynes probability judgment was intended to be objective in the sense that logic is objective. Although different people may accept different premises, whether or not a conclusion follows logically from a given set of premises is objective. Though Ramsey [26] attacked this aspect of Keynes' theory, it can be argued
Conjoint Probabilistic Subband Modeling
 MASSACHUSETTS INSTITUTE OF TECHNOLOGY
, 1997
Abstract

Cited by 14 (0 self)
A new approach to high-order conditional probability density estimation is developed, based on a partitioning of the conditioning space via decision trees. The technique is applied to image compression, image restoration, and texture synthesis, and the results are compared with those obtained by standard mixture density and linear regression models. By applying the technique to subband-domain processing, some evidence is provided to support the following statement: the appropriate tradeoff between spatial and spectral localization in linear preprocessing shifts towards greater spatial localization when subbands are processed in a way that exploits interdependence.
Training teachers to teach probability
 Journal of Statistics Education
, 2004
Abstract

Cited by 12 (1 self)
In this paper we analyse the reasons why teaching probability is difficult for mathematics teachers, we describe the contents needed in the didactical preparation of teachers to teach probability, and we present examples of activities to carry out this training. Nowadays probability and statistics are part of the mathematics curricula for primary and secondary school in many countries. The reasons to include this teaching have been repeatedly highlighted over the past 20 years (e.g. Holmes, 1980; Hawkins et al., 1991; Vere-Jones, 1995), and include the usefulness of statistics and probability for daily life, their instrumental role in other disciplines, the need for basic stochastic knowledge in many professions, and their role in developing critical reasoning. However, teaching probability and statistics is not easy for mathematics teachers. Primary and secondary level mathematics teachers frequently lack specific preparation in statistics education. For example, in Spain, prospective secondary teachers with a major in mathematics do not receive specific training in statistics education. The situation is even worse for primary teachers, most of whom have not had basic training in statistics, and this could be extended to many countries. There can be little support from textbooks and curriculum documents prepared for primary and secondary teachers, because
On modeling uncertainty with interval structures
 Computational Intelligence
, 1995
Abstract

Cited by 12 (7 self)
In this paper, we introduce the notion of interval structures in an attempt to establish a unified framework for representing uncertain information. Two views are suggested for the interpretation of an interval structure. A typical example using the compatibility view is the rough-set model in which the lower and upper approximations form an interval structure. Incidence calculus adopts the allocation view in which an interval structure is defined by the tightest lower and upper incidence bounds. The relationship between interval structures and interval-based numeric belief and plausibility functions is also examined. As an application of the proposed model, an algorithm is developed for computing the tightest incidence bounds.
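The rough-set example the abstract mentions can be sketched in a few lines; this is an illustrative sketch under my own naming, not the paper's formalism:

```python
def rough_interval(blocks, target):
    """Lower/upper approximations of `target` under the partition `blocks`.

    The pair (lower, upper) forms the interval structure:
    lower <= target <= upper (as sets).
    """
    lower = set().union(*[b for b in blocks if b <= target])  # classes wholly inside
    upper = set().union(*[b for b in blocks if b & target])   # classes that touch
    return lower, upper

blocks = [{1, 2}, {3}, {4, 5}]  # indiscernibility classes partitioning {1..5}
lower, upper = rough_interval(blocks, {1, 2, 3, 4})
print(lower)  # {1, 2, 3}
print(upper)  # {1, 2, 3, 4, 5}
```

Any set that cannot be expressed as a union of classes, like {1, 2, 3, 4} here, is pinned down only to the interval between its two approximations.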
The Maximum Entropy Approach and Probabilistic IR Models
 ACM TRANSACTIONS ON INFORMATION SYSTEMS
, 1998
Abstract

Cited by 12 (0 self)
The Principle of Maximum Entropy is discussed and two classic probabilistic models of information retrieval, the Binary Independence Model of Robertson and Sparck Jones and the Combination Match Model of Croft and Harper are derived using the maximum entropy approach. The assumptions on which the classical models are based are not made. In their place, the probability distribution of maximum entropy consistent with a set of constraints is determined. It is argued that this subjectivist approach is more philosophically coherent than the frequentist conceptualization of probability that is often assumed as the basis of probabilistic modeling and that this philosophical stance has important practical consequences with respect to the realization of information retrieval research.
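As a toy instance of the maximum entropy principle the paper builds on (the die example is Jaynes' classic, not this paper's; the solver below is my sketch): among distributions on {1,...,6} with a prescribed mean, the entropy maximizer has the exponential form p_k proportional to exp(lambda*k), with lambda fixed by the mean constraint.

```python
from math import exp

def maxent_die(mean, lo=-10.0, hi=10.0, iters=100):
    """Max-entropy distribution on 1..6 with the given mean (bisection on lambda)."""
    def mean_of(lam):
        w = [exp(lam * k) for k in range(1, 7)]
        z = sum(w)
        return sum(k * wk for k, wk in zip(range(1, 7), w)) / z
    for _ in range(iters):
        mid = (lo + hi) / 2.0
        if mean_of(mid) < mean:   # mean_of is increasing in lambda
            lo = mid
        else:
            hi = mid
    lam = (lo + hi) / 2.0
    w = [exp(lam * k) for k in range(1, 7)]
    z = sum(w)
    return [wk / z for wk in w]

p = maxent_die(3.5)   # mean 3.5 forces lambda = 0, i.e. the uniform distribution
print([round(x, 3) for x in p])  # [0.167, 0.167, 0.167, 0.167, 0.167, 0.167]
```

The same pattern, maximizing entropy subject only to the stated constraints, is what the paper applies to term-occurrence probabilities in place of the classical independence assumptions.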
Ensembles and Experiments in Classical and Quantum Physics
 Int. J. Mod. Phys. B
, 2003
Abstract

Cited by 8 (5 self)
A philosophically consistent axiomatic approach to classical and quantum mechanics is given. The approach realizes a strong formal implementation of Bohr's correspondence principle. In all instances, classical and quantum concepts are fully parallel: the same general theory has a classical realization and a quantum realization.
Relative states and the environment: Einselection, envariance, quantum darwinism, and the existential interpretation
, 2007
Abstract

Cited by 7 (0 self)
Starting with the basic axioms of quantum theory we revisit the “Relative State Interpretation” set out 50 years ago by Hugh Everett III (1957a,b). His approach explains “collapse of the wavepacket” by postulating that an observer perceives the state of the “rest of the Universe” relative to his own state, or, to be more precise, relative to the state of his records. This allows quantum theory to be universally valid. However, while Everett explains the perception of collapse, the relative state approach raises three questions absent in Bohr’s Copenhagen Interpretation, which relied on the independent existence of an ab initio classical domain. One is now forced to seek sets of preferred, effectively classical but ultimately quantum states that can define branches of the universal state vector and allow observers to keep reliable records. Without such a (i) preferred basis, relative states are “too relative”, and the approach suffers from basis ambiguity. Moreover, the universal validity of quantum theory raises the issue of the (ii) origin of probabilities, and of Born’s rule p_k = |ψ_k|^2, which is simply postulated in textbook discussions. Last but not least, even preferred quantum states (defined e.g. by einselection, i.e. environment-induced superselection) are still quantum. Therefore they cannot be found out by initially ignorant observers through direct measurement without getting disrupted. Yet, states of macroscopic objects exist objectively and can be found out by anyone. So,