Results 1–10 of 23
What Is a Random Sequence?
The American Mathematical Monthly, 2002
Cited by 9 (1 self)
Abstract: Are there laws of randomness? These old and deep philosophical questions still stir controversy today. Some scholars have suggested that our difficulty in dealing with notions of randomness could be gauged by the comparatively late development of probability theory, which had a …
On the relation between quantum mechanical probabilities and event frequencies
Annals of Physics 313 (2004) 368, quant-ph/0403207
String Pattern Matching For A Deluge Survival Kit
2000
Cited by 6 (1 self)
Abstract: String Pattern Matching concerns itself with algorithmic and combinatorial issues related to matching and searching on linearly arranged sequences of symbols, arguably the simplest possible discrete structures. As unprecedented volumes of sequence data are amassed, disseminated and shared at an increasing pace, effective access to, and manipulation of, such data depend crucially on the efficiency with which strings are structured, compressed, transmitted, stored, searched and retrieved. This paper samples, from this perspective and with the authors' own bias, a rich arsenal of ideas and techniques developed in more than three decades of history.
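As a minimal illustration of the kind of algorithm this survey covers (a standard textbook matcher, not code from the paper): the Knuth-Morris-Pratt algorithm finds every occurrence of a pattern in a text in O(n + m) time by precomputing how much of a partial match can be reused after a mismatch.

```python
def kmp_search(text, pattern):
    """Knuth-Morris-Pratt: return the start indices of all occurrences
    of `pattern` in `text`, in O(len(text) + len(pattern)) time."""
    if not pattern:
        return list(range(len(text) + 1))
    # Failure function: fail[i] = length of the longest proper prefix of
    # pattern[:i+1] that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text once, reusing matched prefix lengths instead of
    # backtracking in the text.
    hits, k = [], 0
    for i, c in enumerate(text):
        while k > 0 and c != pattern[k]:
            k = fail[k - 1]
        if c == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]  # allow overlapping occurrences
    return hits
```

For example, `kmp_search("abababa", "aba")` reports the overlapping matches at positions 0, 2 and 4.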
A Calculator for Confidence Intervals
Comput. Phys. Commun. 149, 2002
Cited by 5 (0 self)
Abstract: A calculator program has been written to give confidence intervals on branching ratios for rare decay modes (or similar quantities), calculated from the number of events observed, the acceptance factor, the background estimate and the associated errors. Results from different experiments (or different channels from the same experiment) can be combined. The calculator is available in …
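The abstract names the inputs such a calculator takes (observed events, acceptance, background estimate, and their errors). The paper's actual program is not reproduced here; purely as an illustration of how those inputs combine, the following is a naive Gaussian-approximation sketch (function name and formula are my own; rare-decay counts properly require Poisson-based intervals such as Feldman-Cousins, which is the regime the paper addresses).

```python
import math

def branching_ratio_ci(n_obs, background, acceptance,
                       sigma_b, sigma_acc, n_total, z=1.96):
    """Illustrative Gaussian-approximation confidence interval for
    BR = (n_obs - background) / (acceptance * n_total).

    NOT the paper's calculator: for small counts a proper treatment
    needs Poisson statistics; this only sketches the error budget.
    """
    signal = n_obs - background
    if signal <= 0 or acceptance <= 0:
        raise ValueError("Gaussian sketch needs a positive signal estimate")
    br = signal / (acceptance * n_total)
    # Poisson variance on the observed count plus the quoted errors on the
    # background estimate and on the acceptance factor.
    var_signal = n_obs + sigma_b ** 2
    rel_err2 = var_signal / signal ** 2 + (sigma_acc / acceptance) ** 2
    half_width = z * br * math.sqrt(rel_err2)
    return br - half_width, br + half_width
```

For example, with 25 observed events over an estimated background of 5.0, acceptance 0.4, and a parent sample of 10^6 decays, the central value is 5e-5 and the sketch brackets it symmetrically.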
On the consistency of PCR6 with the averaging rule and its application to probability estimation
in Proc. of Fusion 2013
Cited by 5 (4 self)
Abstract: Since the development of belief function theory, introduced by Shafer in the seventies, many combination rules have been proposed in the literature to combine belief functions, especially (but not only) in highly conflicting situations, because the emblematic Dempster's rule generates counterintuitive and unacceptable results in practical applications. Many attempts have been made during the last thirty years to propose better rules of combination based on different frameworks and justifications. Recently, in the DSmT (Dezert-Smarandache Theory) framework, two interesting and sophisticated rules (the PCR5 and PCR6 rules) have been proposed based on the Proportional Conflict Redistribution (PCR) principle. These two rules coincide for the combination of two basic belief assignments, but they differ in general as soon as three or more sources have to be combined, because the PCR used in PCR5 differs from that used in PCR6. In this paper we show why PCR6 is better than PCR5 for combining three or more sources of evidence, and we prove the coherence of PCR6 with the simple averaging rule used classically to estimate probability under the frequentist interpretation of the probability measure. We show that such a probability estimate cannot be obtained using the Dempster-Shafer (DS) rule, nor the PCR5 rule.
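The averaging rule the abstract refers to can be made concrete with a toy encoding (the encoding and function below are my own illustration, not the paper's or any DSmT library's API): if each of s sources is a categorical basic belief assignment putting all its mass on the single outcome that source observed, then simply averaging the assignments reproduces the frequentist estimate n_x / s.

```python
from collections import Counter

def averaging_rule(observations, frame):
    """Average s categorical basic belief assignments, where source i
    puts mass 1 on the singleton it observed. The result is exactly the
    frequentist probability estimate count(x) / s for each x in the
    frame of discernment. Toy encoding for illustration only."""
    s = len(observations)
    counts = Counter(observations)
    return {x: counts[x] / s for x in frame}

# Four sources: three observed "heads", one observed "tails".
est = averaging_rule(["heads", "tails", "heads", "heads"],
                     ["heads", "tails"])
# est["heads"] == 0.75, est["tails"] == 0.25
```

The paper's claim is that, on inputs of this categorical kind, PCR6 agrees with this average while Dempster-Shafer and PCR5 do not.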
Probabilities are single-case, or nothing
Optics and Spectroscopy, 2005
Cited by 4 (0 self)
Abstract: Physicists have, hitherto, mostly adopted a frequentist conception of probability, according to which probability statements apply only to ensembles. It is argued that we should, instead, adopt an epistemic, or Bayesian, conception, in which probabilities are conceived as logical constructs rather than physical realities, and in which probability statements do apply directly to individual events. The question is closely related to the disagreement between the orthodox school of statistical thought and the Bayesian school. It has important technical implications (it makes a difference what statistical methodology one adopts). It may also have important implications for the interpretation of the quantum state.
CONCERNING DICE AND DIVINITY
2006
Cited by 4 (0 self)
Abstract: Einstein initially objected to the probabilistic aspect of quantum mechanics: the idea that God is playing at dice. Later he changed his ground, and focussed instead on the point that the Copenhagen Interpretation leads to what Einstein saw as the abandonment of physical realism. We argue here that Einstein's initial intuition was perfectly sound, and that it is precisely the fact that quantum mechanics is a fundamentally probabilistic theory which is at the root of all the controversies regarding its interpretation. Probability is an intrinsically logical concept. This means that the quantum state has an essentially logical significance. It is extremely difficult to reconcile that fact with Einstein's belief that it is the task of physics to give us a vision of the world apprehended sub specie aeternitatis. Quantum mechanics thus presents us with a simple choice: either to follow Einstein in looking for a theory which is not probabilistic at the fundamental level, or else to accept that physics does not in fact put us in the position of God looking down on things from above. There is a widespread fear that the latter alternative must inevitably lead to a greatly impoverished …
Facts, Values and Quanta
2005
Cited by 3 (0 self)
Abstract: Quantum mechanics is a fundamentally probabilistic theory (at least so far as the empirical predictions are concerned). It follows that, if one wants to properly understand quantum mechanics, it is essential to clearly understand the meaning of probability statements. The interpretation of probability has excited nearly as much philosophical controversy as the interpretation of quantum mechanics. Twentieth-century physicists have mostly adopted a frequentist conception. In this paper it is argued that we ought, instead, to adopt a logical or Bayesian conception. The paper includes a comparison of the orthodox and Bayesian theories of statistical inference. It concludes with a few remarks concerning the implications for the concept of physical reality.
Role And Meaning Of Subjective Probability: Some Comments On Common Misconceptions
2000
Cited by 2 (0 self)
Abstract: Criticisms of so-called 'subjective probability' come on the one hand from those who maintain that probability in physics has only a frequentistic interpretation, and, on the other, from those who tend to 'objectivise' Bayesian theory, arguing, e.g., that subjective probabilities are indeed based 'only on private introspection'. Some of the common misconceptions about subjective probability will be commented upon, in support of the thesis that coherence is the most crucial, universal and 'objective' way to assess our confidence in events of any kind.
Key words: Subjective Bayesian Theory, Measurement Uncertainty
1. Introduction. The role of scientists is, generally speaking, to understand Nature, in order to forecast as yet unobserved ('future') events, independently of whether or not these events can be influenced. In laboratory experiments and all technological applications, observations depend on our intentional manipulation of the external world. However, other scientific activities …
THE DEVELOPMENT OF SUBJECTIVE BAYESIANISM
Abstract: The Bayesian approach to inductive reasoning originated in two brilliant insights. In 1654 Blaise Pascal, in the course of a correspondence with Fermat [1769], recognized that states of uncertainty can be quantified using probabilities and expectations. In the early 1760s Thomas Bayes [1763] first understood that learning can be represented probabilistically using what is now called Bayes's Theorem. These ideas serve as the basis for all Bayesian thought.
1.1 Pascal's Insights: Probability and Expectation. In modern terms, Pascal's insight is that uncertainty about the occurrence of an event can be expressed as a probability and, more generally, that uncertainty about the value of a quantity can be expressed as a mathematical expectation. The basic objects of uncertainty can be thought of as propositions or events in a nonempty Boolean algebra Ω that is closed under negation and countable disjunction. A probability function on Ω is a mapping P of Ω into the real numbers that obeys these laws: …
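The excerpt is cut off just before the laws are listed; presumably they are the standard probability axioms, which in the notation of the excerpt (a probability function P on the Boolean algebra Ω) read:

```latex
P(A) \ge 0 \quad \text{for all } A \in \Omega, \qquad P(\top) = 1, \qquad
P\!\left(\bigvee_{i=1}^{\infty} A_i\right) = \sum_{i=1}^{\infty} P(A_i)
\quad \text{whenever the } A_i \text{ are pairwise incompatible.}
```

The third law (countable additivity) is the one that uses the closure of Ω under countable disjunction mentioned in the excerpt.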