Results 1–10 of 309
Prior Probabilities
 IEEE Transactions on Systems Science and Cybernetics
, 1968
Abstract

Cited by 249 (4 self)
e case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probability of success. In realistic problems, both the transformation group analysis and the principle of maximum entropy are needed to determine the prior. The distributions thus found are uniquely determined by the prior information, independently of the choice of parameters. In a certain class of problems, therefore, the prior distributions may now be claimed to be fully as "objective" as the sampling distributions. I. Background of the problem Since the time of Laplace, applications of probability theory have been hampered by difficulties in the treatment of prior information. In realistic problems of decision or inference, we often have prior information which is highly relevant to the question being asked; to fail to take it into account is to commit the most obvious inconsistency of reasoning and may lead to absurd or dangerously misleading results. As an extreme examp
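The two determinations named in this abstract can be sketched in standard form. The following is a reconstruction of well-known results of this kind, not a quotation from the paper:

```latex
% Transformation groups: a prior for a scale parameter \sigma that is
% invariant under rescaling \sigma \to a\sigma must satisfy
\pi(\sigma)\,d\sigma = \pi(a\sigma)\,d(a\sigma)
\quad\Longrightarrow\quad \pi(\sigma) \propto 1/\sigma,
% while for a location parameter \mu, invariance under \mu \to \mu + b
% forces the uniform prior \pi(\mu) \propto \text{const}.

% Maximum entropy: among densities on x \ge 0 with prescribed mean m,
% maximizing H[p] = -\int_0^\infty p(x)\log p(x)\,dx yields
p(x) = \frac{1}{m}\,e^{-x/m}.
```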
Quantum Equilibrium and the Origin of Absolute Uncertainty
, 1992
Abstract

Cited by 166 (52 self)
The quantum formalism is a "measurement" formalism, a phenomenological formalism describing certain macroscopic regularities. We argue that it can be regarded, and best be understood, as arising from Bohmian mechanics, which is what emerges from Schrödinger's equation for a system of particles when we merely insist that "particles" means particles. While distinctly non-Newtonian, Bohmian mechanics is a fully deterministic theory of particles in motion, a motion choreographed by the wave function. We find that a Bohmian universe, though deterministic, evolves in such a manner that an appearance of randomness emerges, precisely as described by the quantum formalism and given, for example, by ρ = |ψ|². A crucial ingredient in our analysis of the origin of this randomness is the notion of the effective wave function of a subsystem, a notion of interest in its own right and of relevance to any discussion of quantum theory. When the quantum formalism is regarded as arising in this way, the paradoxes and perplexities so often associated with (nonrelativistic) quantum theory simply evaporate.
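For orientation, the dynamics described in this abstract can be written out explicitly. This is the standard textbook form of the theory, not a quotation from the paper:

```latex
% Bohmian mechanics: the configuration Q = (Q_1, \dots, Q_N) is guided
% by the wave function \psi via
\frac{dQ_k}{dt}
  = \frac{\hbar}{m_k}\,
    \mathrm{Im}\!\left(\frac{\nabla_k \psi}{\psi}\right)(Q_1,\dots,Q_N),
% while \psi itself obeys the Schr\"odinger equation.
% The quantum-equilibrium distribution \rho = |\psi|^2 is equivariant:
\rho_{t_0} = |\psi_{t_0}|^2
\;\Longrightarrow\;
\rho_t = |\psi_t|^2 \quad\text{for all } t.
```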
Nonequilibrium critical phenomena and phase transitions into absorbing states
 ADVANCES IN PHYSICS
, 2000
The estimation of probabilities
, 1965
Abstract

Cited by 94 (2 self)
By way of introduction, a classification of kinds of probability is given in the form of a tree which also forms an approximate hierarchy: psychological, subjective, logical, physical, and tautological. Various relationships between these kinds of probability are mentioned. Methods, all more or less Bayesian, for the estimation of physical probabilities are then described. Binomial and multinomial probabilities are estimated by means of a three-tiered hierarchical Bayesian method. The method can also be regarded, in some of its aspects, as Bayesian in the ordinary sense, wherein the initial distribution for the physical probabilities is a weighted sum of symmetrical Dirichlet distributions. It can be proved that this is equivalent to the use of a single symmetrical Dirichlet distribution whose parameter is selected after sampling. Thus, in this problem, an ordinary Bayesian method implies an empirical Bayesian method. Next the species-sampling or vocabulary-sampling problem is considered wherein there is a multinomial population having a very large number of categories. The theory derives from a suggestion of Turing's, which in part anticipates the empirical Bayesian method. Among other things, it leads to estimates of population coverage for an enlarged sample. Finally the estimation of probabilities in multidimensional contingency tables is considered. The main method here is that of maximum entropy, but it is shown that this can be subsumed under a more general method of minimum discriminability for the formulation of hypotheses. Entropy is best regarded as a special case of the older and more obviously Bayesian concept of "expected weight of evidence".
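Two of the estimators sketched in this abstract are simple to state concretely. Below is a minimal Python illustration of (a) the posterior-mean multinomial estimate under a single symmetric Dirichlet prior with parameter k, and (b) Turing's estimate of the unseen-species mass. Both are standard textbook forms; the function names are illustrative, not the paper's.

```python
def dirichlet_estimate(counts, k=1.0):
    """Posterior-mean cell probabilities under a symmetric Dirichlet(k)
    prior: (n_i + k) / (N + k*K), where N = total count, K = #cells."""
    N = sum(counts)
    K = len(counts)
    return [(n + k) / (N + k * K) for n in counts]

def unseen_mass(counts):
    """Turing's estimate of the total probability of unseen species:
    (number of species observed exactly once) / (sample size)."""
    N = sum(counts)
    return sum(1 for n in counts if n == 1) / N

# Example: three categories observed 3, 1, and 0 times.
probs = dirichlet_estimate([3, 1, 0], k=1.0)   # [4/7, 2/7, 1/7]
missed = unseen_mass([3, 1, 0])                # 1/4
```

With k = 1 this reduces to Laplace's rule of succession; the paper's point is that k itself can be chosen after sampling, making the ordinary Bayesian method empirical-Bayes in effect.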
Chain Graph Models and their Causal Interpretations
 B
, 2001
Abstract

Cited by 68 (7 self)
Chain graphs are a natural generalization of directed acyclic graphs (DAGs) and undirected graphs. However, the apparent simplicity of chain graphs belies the subtlety of the conditional independence hypotheses that they represent. There are a number of simple and apparently plausible, but ultimately fallacious interpretations of chain graphs that are often invoked, implicitly or explicitly. These interpretations also lead to flawed methods for applying background knowledge to model selection. We present a valid interpretation by showing how the distribution corresponding to a chain graph may be generated as the equilibrium distribution of dynamic models with feedback. These dynamic interpretations lead to a simple theory of intervention, extending the theory developed for DAGs. Finally, we contrast chain graph models under this interpretation with simultaneous equation models which have traditionally been used to model feedback in econometrics. Keywords: Causal model; cha...
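The equilibrium idea can be illustrated with a toy example. This is my own sketch, not the construction in the paper: resample two binary variables in alternation from their conditional distributions, and the long-run frequencies settle into the joint distribution of the corresponding undirected model.

```python
import math
import random

def gibbs_equilibrium(theta=1.0, steps=100_000, seed=0):
    """Gibbs sampler for p(x, y) proportional to exp(theta * [x == y]),
    with x, y in {0, 1}.  Returns the long-run frequency of x == y,
    which approximates the equilibrium value e^theta / (e^theta + 1)."""
    rng = random.Random(seed)
    x, y = 0, 0
    agree = 0
    for _ in range(steps):
        # Resample x from p(x | y): extra weight exp(theta) on matching y.
        w1, w0 = math.exp(theta * (y == 1)), math.exp(theta * (y == 0))
        x = 1 if rng.random() < w1 / (w0 + w1) else 0
        # Resample y from p(y | x), symmetrically.
        w1, w0 = math.exp(theta * (x == 1)), math.exp(theta * (x == 0))
        y = 1 if rng.random() < w1 / (w0 + w1) else 0
        agree += (x == y)
    return agree / steps
```

Running `gibbs_equilibrium(1.0)` yields a value close to e/(e+1) ≈ 0.731, the agreement probability under the undirected model, even though each step is a directed (feedback) update.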
Boltzmann's Approach to Statistical Mechanics
 IN: CHANCE IN PHYSICS, FOUNDATIONS
, 2002
Abstract

Cited by 56 (4 self)
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth century innovations, such as the identification of the state of a physical system with a probability distribution ρ on its phase space, of its thermodynamic entropy with the Gibbs entropy of ρ, and the invocation of the notions of ergodicity and mixing for the justification of the foundations of statistical mechanics, are thoroughly misguided.
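The identification the abstract objects to can be made explicit. These are the standard definitions, in my notation, with Γ_M the region of phase space realizing macrostate M:

```latex
% Boltzmann entropy of an individual microstate X with macrostate M(X):
S_B(X) = k_B \log \bigl|\Gamma_{M(X)}\bigr|,
% where |\Gamma_M| is the phase-space volume of the macrostate region.

% Gibbs entropy of a probability distribution \rho on phase space:
S_G(\rho) = -k_B \int \rho(x)\,\log\rho(x)\,dx.
% By Liouville's theorem, S_G is constant under the Hamiltonian flow,
% which is one source of the confusion the abstract describes.
```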
Variational PDE models in image processing
, 2002
Abstract

Cited by 46 (11 self)
This paper is based on a plenary presentation given by Tony F. Chan at the 2002 Joint Mathematical Meeting, San Diego, and has been supported in part by NSF under grant numbers DMS-9973341 (Chan), DMS-0202565 (Shen), and ITR-0113439 (Vese), by ONR under N00014-02-1-0015 (Chan), and by NIH under NIH-P20MH65166 (Chan and Vese). For the preprints and reprints mentioned in this paper, please visit our web site at: www.math.ucla.edu/~imagers. Chan and Vese are with the Department of Mathematics, UCLA, Los Angeles, CA 90095, {chan, lvese}@math.ucla.edu; Shen is with the School of Mathematics, University of Minnesota, Minneapolis, MN 55455, jhshen@math.umn.edu
Bayesian Methods: General Background
, 1986
Abstract

Cited by 44 (1 self)
We note the main points of history, as a framework on which to hang many background remarks concerning the nature and motivation of Bayesian/Maximum Entropy methods. Experience has shown that these are needed in order to understand recent work and problems. A more complete account of the history, with many more details and references, is given in Jaynes (1978). The following discussion is essentially nontechnical; the aim is only to convey a little introductory "feel" for our outlook, purpose, and terminology, and to alert newcomers to common pitfalls of misunderstanding. Contents: Herodotus; Bernoulli; Bayes; Laplace; Jeffreys; Cox; Shannon; Communication difficulties; Is our logic open or closed?; Downward analysis in statistical mechanics; Current problems; References. Presented at the Fourth Annual Workshop on Bayesian/Maximum Entropy Methods, University of Calgary, August 1984. In the Proceedings Volume, Maximum Entropy and Bayesian Methods in Applied Statistics, J. H....
Statistical Modeling and Conceptualization of Visual Patterns
, 2003
Abstract

Cited by 43 (4 self)
Natural images contain an overwhelming number of visual patterns generated by diverse stochastic processes. Defining and modeling these patterns is of fundamental importance for generic vision tasks, such as perceptual organization, segmentation, and recognition. The objective of this epistemological paper is to summarize various threads of research in the literature and to pursue a unified framework for conceptualization, modeling, learning, and computing visual patterns. This paper starts with reviewing four research streams: 1) the study of image statistics, 2) the analysis of image components, 3) the grouping of image elements, and 4) the modeling of visual patterns. The models from these research streams are then divided into four categories according to their semantic structures: 1) descriptive models, i.e., Markov random fields (MRF) or Gibbs, 2) variants of descriptive models (causal MRF and "pseudo-descriptive" models), 3) generative models, and 4) discriminative models. The objectives, principles, theories, and typical models are reviewed in each category and the relationships between the four types of models are studied. Two central themes emerge from the relationship studies.