CiteSeerX

Results 1 - 10 of 4,307

Fidelity measure and conservation of information in general probabilistic theories

by A. R. Plastino, 2009

Probabilistic Theories of Causality

by Jon Williamson - in The Oxford Handbook of Causation, 2009
"... This chapter provides an overview of a range of probabilistic theories of causality, including those of Reichenbach, Good and Suppes, and the contemporary causal net approach. It discusses two key problems for probabilistic accounts: counterexamples to these theories and their failure to account for ..."
Cited by 8 (7 self)

A Probabilistic Theory of Clustering

by Edward R. Dougherty, Marcel Brun, 2004
"... Clustering is typically considered a subjective process, which makes it problematic. For instance, how does one make statistical inferences based on clustering? The matter is different with pattern classification, for which two fundamental characteristics can be stated: (1) the error of a classifier can be estimated using "test data," and (2) a classifier can be learned using "training data." This paper presents a probabilistic theory of clustering, including both learning (training) and error estimation (testing). The theory is based on operators on random labeled point ..."
Cited by 13 (3 self)

Probabilistic Inference Using Markov Chain Monte Carlo Methods

by Radford M. Neal, 1993
"... Probabilistic inference is an attractive approach to uncertain reasoning and empirical learning in artificial intelligence. Computational difficulties arise, however, because probabilistic models with the necessary realism and flexibility lead to complex distributions over high-dimensional spaces. R ..."
Cited by 736 (24 self)
for approximate counting of large sets. In this review, I outline the role of probabilistic inference in artificial intelligence, present the theory of Markov chains, and describe various Markov chain Monte Carlo algorithms, along with a number of supporting techniques. I try to present a comprehensive picture

Symmetry and composition in probabilistic theories

by Alexander Wilce - ENTCS, 2011
Cited by 5 (4 self)
Abstract not found

Inferring Probabilistic Theories from

by Edwin P. D. Pednault
"... When formulating a theory based on observations influenced by noise or other sources of uncertainty, it becomes necessary to decide whether the proposed theory agrees with the data “well enough.” This paper presents a criterion for making this judgement. The criterion is based on a gambling scenario ..."

Generalised probabilistic theories . . .

by Samuel Fiorini et al., 2014
Abstract not found

Probabilistic Theories of the Visual Cortex

by A. L. Yuille, D. Kersten
THE VERY EARLY VISUAL SYSTEM This lecture first briefly reviews the structural organization of V1, the properties of simple cells, and divisive normalization. The lecture also illustrates principles such as sparsity, independence, and inverting generative models. A. Review: From Retina and LGN to V1 Light is captured in the retina, transmitted to the LGN, and then to area V1 of the visual cortex. Receptive field properties of neurons in the retina and LGN are generally believed to be modelled by symmetric center-surround cells – i.e. the Laplacian of a Gaussian filter, which looks like a Mexican Hat. This may be an over-simplification (e.g., see Meister for an alternative viewpoint) but Yang Dan reports that it is possible to reconstruct the input image from the responses of neurons in the retina or LGN (which would seem to be impossible if the standard models were badly wrong). There is an expansion (by a factor between 80 and 400) as we move from the LGN to V1. This is not surprising because V1 starts the hard problem of interpreting the image (while the retina and LGN perform the simpler tasks of capturing the image and transmitting it to the cortex – at least this is the ...

Compatibility for Probabilistic Theories

by Stan Gudder
Abstract not found

Generalized Probabilistic Theories [1]

by Peter Janotta, Christian Gogolin, Jonathan Barrett, Nicolas Brunner
"... on non-local correlations from the structure of the local state space ..."

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University