Results 1-10 of 155
Prior Probabilities
 IEEE Transactions on Systems Science and Cybernetics
, 1968
Abstract

Cited by 165 (3 self)
e case of location and scale parameters, rate constants, and in Bernoulli trials with unknown probability of success. In realistic problems, both the transformation group analysis and the principle of maximum entropy are needed to determine the prior. The distributions thus found are uniquely determined by the prior information, independently of the choice of parameters. In a certain class of problems, therefore, the prior distributions may now be claimed to be fully as "objective" as the sampling distributions. I. Background of the problem Since the time of Laplace, applications of probability theory have been hampered by difficulties in the treatment of prior information. In realistic problems of decision or inference, we often have prior information which is highly relevant to the question being asked; to fail to take it into account is to commit the most obvious inconsistency of reasoning and may lead to absurd or dangerously misleading results. As an extreme examp
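The principle of maximum entropy invoked above can be illustrated numerically with Jaynes's classic dice example: among all distributions on a finite support with a prescribed mean, the maximum-entropy one has the exponential form p_i ∝ exp(λ x_i), with the multiplier λ fixed by the constraint. This is a minimal sketch, assuming a six-sided die constrained to mean 4.5 (the numbers are illustrative, not taken from the paper):

```python
import numpy as np
from scipy.optimize import brentq

def maxent_mean(support, target_mean):
    """Maximum-entropy distribution on `support` with a fixed mean.

    The solution has the exponential-family form p_i ∝ exp(lam * x_i);
    the Lagrange multiplier `lam` is found by one-dimensional root-finding.
    """
    x = np.asarray(support, dtype=float)

    def mean_gap(lam):
        w = np.exp(lam * (x - x.mean()))  # centered for numerical stability
        p = w / w.sum()
        return float(p @ x) - target_mean

    lam = brentq(mean_gap, -50.0, 50.0)
    w = np.exp(lam * (x - x.mean()))
    return w / w.sum()

# Die faces 1..6 constrained to have mean 4.5: the maxent solution
# tilts probability toward the larger faces.
p = maxent_mean([1, 2, 3, 4, 5, 6], 4.5)
```

Since the target mean exceeds the uniform mean 3.5, λ comes out positive and the resulting probabilities increase monotonically with the face value.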
Quantum Equilibrium and the Origin of Absolute Uncertainty
, 1992
Abstract

Cited by 112 (47 self)
The quantum formalism is a "measurement" formalism, a phenomenological formalism describing certain macroscopic regularities. We argue that it can be regarded, and best be understood, as arising from Bohmian mechanics, which is what emerges from Schrödinger's equation for a system of particles when we merely insist that "particles" means particles. While distinctly non-Newtonian, Bohmian mechanics is a fully deterministic theory of particles in motion, a motion choreographed by the wave function. We find that a Bohmian universe, though deterministic, evolves in such a manner that an appearance of randomness emerges, precisely as described by the quantum formalism and given, for example, by ρ = |ψ|². A crucial ingredient in our analysis of the origin of this randomness is the notion of the effective wave function of a subsystem, a notion of interest in its own right and of relevance to any discussion of quantum theory. When the quantum formalism is regarded as arising in this way, the paradoxes and perplexities so often associated with (nonrelativistic) quantum theory simply evaporate.
Nonequilibrium critical phenomena and phase transitions into absorbing states
 ADVANCES IN PHYSICS
, 2000
Chain Graph Models and their Causal Interpretations
 J. Roy. Statist. Soc. Ser. B
, 2001
Abstract

Cited by 46 (4 self)
Chain graphs are a natural generalization of directed acyclic graphs (DAGs) and undirected graphs. However, the apparent simplicity of chain graphs belies the subtlety of the conditional independence hypotheses that they represent. There are a number of simple and apparently plausible, but ultimately fallacious interpretations of chain graphs that are often invoked, implicitly or explicitly. These interpretations also lead to flawed methods for applying background knowledge to model selection. We present a valid interpretation by showing how the distribution corresponding to a chain graph may be generated as the equilibrium distribution of dynamic models with feedback. These dynamic interpretations lead to a simple theory of intervention, extending the theory developed for DAGs. Finally, we contrast chain graph models under this interpretation with simultaneous equation models which have traditionally been used to model feedback in econometrics. Keywords: Causal model; cha...
Boltzmann's Approach to Statistical Mechanics
 IN: CHANCE IN PHYSICS, FOUNDATIONS
, 2002
Abstract

Cited by 36 (3 self)
In the last quarter of the nineteenth century, Ludwig Boltzmann explained how irreversible macroscopic laws, in particular the second law of thermodynamics, originate in the time-reversible laws of microscopic physics. Boltzmann's analysis, the essence of which I shall review here, is basically correct. The most famous criticisms of Boltzmann's later work on the subject have little merit. Most twentieth-century innovations, such as the identification of the state of a physical system with a probability distribution ρ on its phase space, of its thermodynamic entropy with the Gibbs entropy of ρ, and the invocation of the notions of ergodicity and mixing for the justification of the foundations of statistical mechanics, are thoroughly misguided.
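For a discrete distribution the Gibbs entropy referred to above is S = -k_B Σ_i p_i ln p_i. A quick numerical check (with k_B set to 1, and example distributions chosen only for illustration) confirms the familiar fact that it is maximized by the uniform distribution:

```python
import numpy as np

def gibbs_entropy(p):
    """Gibbs entropy -sum p ln p in units of k_B, skipping zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

uniform = np.full(4, 0.25)              # maximally spread over 4 states
peaked = np.array([0.97, 0.01, 0.01, 0.01])  # nearly concentrated
# gibbs_entropy(uniform) equals ln(4); the peaked distribution is lower.
```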
Bayesian Methods: General Background
, 1986
Abstract

Cited by 36 (1 self)
We note the main points of history, as a framework on which to hang many background remarks concerning the nature and motivation of Bayesian/Maximum Entropy methods. Experience has shown that these are needed in order to understand recent work and problems. A more complete account of the history, with many more details and references, is given in Jaynes (1978). The following discussion is essentially nontechnical; the aim is only to convey a little introductory "feel" for our outlook, purpose, and terminology, and to alert newcomers to common pitfalls of misunderstanding. Contents: Herodotus; Bernoulli; Bayes; Laplace; Jeffreys; Cox; Shannon; Communication Difficulties; Is Our Logic Open or Closed?; Downward Analysis in Statistical Mechanics; Current Problems; References. Presented at the Fourth Annual Workshop on Bayesian/Maximum Entropy Methods, University of Calgary, August 1984. In the Proceedings Volume, Maximum Entropy and Bayesian Methods in Applied Statistics, J. H....
An Emulator Network for
 SIMD Machine Interconnection Networks, in: Proc. 6th Annual Symposium on Computer Architecture
, 1979
Abstract

Cited by 36 (3 self)
Fig. 0.1. [Proposed cover figure.] The largest connected component of a network of network scientists. This network was constructed based on the coauthorship of papers listed in two well-known review articles [13,83] and a small number of additional papers that were added manually [86]. Each node is colored according to community membership, which was determined using a leading-eigenvector spectral method followed by Kernighan-Lin node-swapping steps [64, 86, 107]. To determine community placement, we used the Fruchterman-Reingold graph visualization [45], a force-directed layout method that is related to maximizing a quality function known as modularity [92]. To apply this method, we treated the communities as if they were themselves the nodes of a (significantly smaller) network with connections rescaled by intercommunity links. We then used the Kamada-Kawai spring-embedding graph visualization algorithm [62] to place the nodes of each individual community (ignoring intercommunity links) and then to rotate and flip the communities for optimal placement (including intercommunity links). We gratefully acknowledge Amanda Traud for preparing this figure. COMMUNITIES IN NETWORKS
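The layout pipeline the caption describes (detect communities, place them as meta-nodes with a force-directed layout, then lay out each community internally) can be sketched with networkx. Greedy modularity maximization stands in here for the leading-eigenvector/Kernighan-Lin method the authors used, and the karate-club graph stands in for their coauthorship data; both substitutions are assumptions for illustration only:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# A small stand-in graph (illustrative, not the authors' coauthorship data).
G = nx.karate_club_graph()

# Step 1: community detection (greedy modularity here, in place of the
# leading-eigenvector + Kernighan-Lin refinement of the caption).
communities = list(greedy_modularity_communities(G))

# Step 2: treat communities as meta-nodes, weight meta-edges by the number
# of inter-community links, and place them with Fruchterman-Reingold.
meta = nx.Graph()
meta.add_nodes_from(range(len(communities)))
index = {v: i for i, c in enumerate(communities) for v in c}
for u, v in G.edges():
    cu, cv = index[u], index[v]
    if cu != cv:
        w = meta.get_edge_data(cu, cv, {"weight": 0})["weight"]
        meta.add_edge(cu, cv, weight=w + 1)
centers = nx.spring_layout(meta, seed=1)  # Fruchterman-Reingold layout

# Step 3: Kamada-Kawai spring embedding inside each community, offset to
# that community's meta-node position.
pos = {}
for i, c in enumerate(communities):
    local = nx.kamada_kawai_layout(G.subgraph(c), scale=0.3)
    for v, xy in local.items():
        pos[v] = centers[i] + xy
```

The final `pos` dictionary maps every node to coordinates and could be passed directly to `nx.draw(G, pos)`; the caption's extra rotate-and-flip optimization of each community is omitted from this sketch.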
Variational PDE models in image processing
, 2002
Abstract

Cited by 34 (11 self)
This paper is based on a plenary presentation given by Tony F. Chan at the 2002 Joint Mathematical Meeting, San Diego, and has been supported in part by NSF under grant numbers DMS-9973341 (Chan), DMS-0202565 (Shen), and ITR-0113439 (Vese), by ONR under N000140210015 (Chan), and by NIH under NIHP20MH65166 (Chan and Vese). For the preprints and reprints mentioned in this paper, please visit our web site at: www.math.ucla.edu/~imagers. Chan and Vese are with the Department of Mathematics, UCLA, Los Angeles, CA 90095, {chan, lvese}@math.ucla.edu; Shen is with the School of Mathematics, University of Minnesota, Minneapolis, MN 55455, jhshen@math.umn.edu
Statistical Modeling and Conceptualization of Visual Patterns
, 2003
Abstract

Cited by 29 (3 self)
Natural images contain an overwhelming number of visual patterns generated by diverse stochastic processes. Defining and modeling these patterns is of fundamental importance for generic vision tasks, such as perceptual organization, segmentation, and recognition. The objective of this epistemological paper is to summarize various threads of research in the literature and to pursue a unified framework for conceptualization, modeling, learning, and computing visual patterns. This paper starts with reviewing four research streams: 1) the study of image statistics, 2) the analysis of image components, 3) the grouping of image elements, and 4) the modeling of visual patterns. The models from these research streams are then divided into four categories according to their semantic structures: 1) descriptive models, i.e., Markov random fields (MRF) or Gibbs, 2) variants of descriptive models (causal MRF and "pseudo-descriptive" models), 3) generative models, and 4) discriminative models. The objectives, principles, theories, and typical models are reviewed in each category and the relationships between the four types of models are studied. Two central themes emerge from the relationship studies.
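The "descriptive" category above (MRF/Gibbs models) can be illustrated with the simplest such model: a binary Ising-style MRF sampled by single-site Gibbs updates. This toy sketch is only illustrative of the model class, not taken from the paper; the lattice size, coupling strength, and sweep count are arbitrary choices:

```python
import numpy as np

def gibbs_sample_ising(shape=(16, 16), beta=0.8, sweeps=50, seed=0):
    """Sample a binary MRF (Ising model on a torus) with Gibbs updates.

    Each site x_ij in {-1, +1} is resampled from its conditional given its
    four neighbors: P(x = +1 | nbrs) = 1 / (1 + exp(-2 * beta * sum(nbrs))).
    """
    rng = np.random.default_rng(seed)
    x = rng.choice([-1, 1], size=shape)
    h, w = shape
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                s = (x[(i - 1) % h, j] + x[(i + 1) % h, j]
                     + x[i, (j - 1) % w] + x[i, (j + 1) % w])
                p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
                x[i, j] = 1 if rng.random() < p_plus else -1
    return x

img = gibbs_sample_ising()
```

With beta well above the critical coupling, neighboring sites align into the smooth patches characteristic of texture-like MRF samples; setting beta near 0 instead yields near-independent noise.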