Results 1–10 of 416
Some methods for classification and analysis of multivariate observations
 In 5th Berkeley Symposium on Mathematical Statistics and Probability
, 1967
Abstract

Cited by 2070 (1 self)
The main purpose of this paper is to describe a process for partitioning an N-dimensional population into k sets on the basis of a sample. The process, which is called 'k-means,' appears to give partitions which are reasonably ...
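The partitioning process the abstract describes alternates two steps: assign each point to its nearest mean, then move each mean to the centroid of its assigned points. A minimal sketch (an illustrative stdlib-only implementation, not the paper's original formulation; the data and function name are hypothetical):

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Partition points (tuples) into k sets by nearest squared distance to a mean."""
    rng = random.Random(seed)
    means = rng.sample(points, k)  # initialize means from the sample itself
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assignment step: attach each point to its nearest current mean.
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, means[j])))
            clusters[i].append(p)
        # Update step: move each mean to the centroid of its cluster.
        means = [tuple(sum(c) / len(cl) for c in zip(*cl)) if cl else means[i]
                 for i, cl in enumerate(clusters)]
    return means, clusters

means, clusters = kmeans([(0.0, 0.0), (0.1, 0.2), (5.0, 5.0), (5.1, 4.9)], k=2)
```

On this toy sample the two well-separated pairs end up in separate sets regardless of which points seed the means.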
Almost Everywhere High Nonuniform Complexity
, 1992
Abstract

Cited by 166 (36 self)
We investigate the distribution of nonuniform complexities in uniform complexity classes. We prove that almost every problem decidable in exponential space has essentially maximum circuit-size and space-bounded Kolmogorov complexity almost everywhere. (The circuit-size lower bound actually exceeds, and thereby strengthens, the Shannon 2^n/n lower bound for almost every problem, with no computability constraint.) In exponential-time complexity classes, we prove that the strongest relativizable lower bounds hold almost everywhere for almost all problems. Finally, we show that infinite pseudorandom sequences have high nonuniform complexity almost everywhere. The results are unified by a new, more powerful formulation of the underlying measure theory, based on uniform systems of density functions, and by the introduction of a new nonuniform complexity measure, the selective Kolmogorov complexity. This research was supported in part by NSF Grants CCR-8809238 and CCR-9157382 and in ...
Probabilistic Logic Programming
, 1992
Abstract

Cited by 139 (8 self)
Of all scientific investigations into reasoning with uncertainty and chance, probability theory is perhaps the best understood paradigm. Nevertheless, all studies conducted thus far into the semantics of quantitative logic programming (cf. van Emden [51], Fitting [18, 19, 20], Blair and Subrahmanian [5, 6, 49, 50], Kifer et al. [29, 30, 31]) have restricted themselves to non-probabilistic semantic characterizations. In this paper, we take a few steps towards rectifying this situation. We define a logic programming language that is syntactically similar to the annotated logics of [5, 6], but in which the truth values are interpreted probabilistically. A probabilistic model theory and fixpoint theory are developed for such programs. This probabilistic model theory satisfies the requirements proposed by Fenstad [16] for a function to be called probabilistic. The logical treatment of probabilities is complicated by two facts: first, that the connectives cannot be interpreted truth function...
Model Checking for a Probabilistic Branching Time Logic with Fairness
 Distributed Computing
, 1998
Abstract

Cited by 126 (39 self)
We consider concurrent probabilistic systems, based on probabilistic automata of Segala & Lynch [55], which allow nondeterministic choice between probability distributions. These systems can be decomposed into a collection of "computation trees" which arise by resolving the nondeterministic, but not probabilistic, choices. The presence of nondeterminism means that certain liveness properties cannot be established unless fairness is assumed. We introduce a probabilistic branching time logic PBTL, based on the logic TPCTL of Hansson [30] and the logic PCTL of [55], resp. pCTL of [14]. The formulas of the logic express properties such as "every request is eventually granted with probability at least p". We give three interpretations for PBTL on concurrent probabilistic processes: the first is standard, while in the remaining two interpretations the branching time quantifiers are taken to range over a certain kind of fair computation trees. We then present a model checking algorithm for...
Complementarity and Nondegeneracy in Semidefinite Programming
, 1995
Abstract

Cited by 101 (9 self)
Primal and dual nondegeneracy conditions are defined for semidefinite programming. Given the existence of primal and dual solutions, it is shown that primal nondegeneracy implies a unique dual solution and that dual nondegeneracy implies a unique primal solution. The converses hold if strict complementarity is assumed. Primal and dual nondegeneracy assumptions do not imply strict complementarity, as they do in LP. The primal and dual nondegeneracy assumptions imply a range of possible ranks for primal and dual solutions X and Z. This is in contrast with LP where nondegeneracy assumptions exactly determine the number of variables which are zero. It is shown that primal and dual nondegeneracy and strict complementarity all hold generically. Numerical experiments suggest probability distributions for the ranks of X and Z which are consistent with the nondegeneracy conditions.
Inverse Rendering for Computer Graphics
, 1998
Abstract

Cited by 91 (4 self)
Creating realistic images has been a major focus in the study of computer graphics for much of its history. This effort has led to mathematical models and algorithms that can compute predictive, or physically realistic, images from known camera positions and scene descriptions that include the geometry of objects, the reflectance of surfaces, and the lighting used to illuminate the scene. These images accurately describe the physical quantities that would be measured from a real scene. Because these algorithms can predict real images, they can also be used in inverse problems to work backward from photographs to attributes of the scene. Work on three such inverse rendering problems is described. The first, inverse lighting, assumes knowledge of geometry, reflectance, and the recorded photograph and solves for the lighting in the scene. A technique using a linear least-squares system is proposed and demonstrated. Also demonstrated is an application of inverse lighting, called relighting, which modifies lighting in photographs. The second two inverse rendering problems solve for unknown reflectance, given images with known geometry, lighting, and camera positions. Photographic texture measurement ...
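The inverse-lighting setup the abstract describes is linear: each pixel value is a known (geometry- and reflectance-derived) linear combination of the unknown light intensities, so recovering the lights is a least-squares solve. A toy sketch under assumed data (the 3-pixel, 2-light matrix below is hypothetical, and the normal equations are solved directly by Cramer's rule; the paper's actual solver is not specified here):

```python
# A[i][j]: known contribution of unit-intensity light j to pixel i.
A = [[0.8, 0.1],
     [0.5, 0.5],
     [0.1, 0.9]]
true_lights = [2.0, 3.0]
# Observed pixel values b = A @ true_lights (a noiseless "photograph").
b = [sum(A[i][j] * true_lights[j] for j in range(2)) for i in range(3)]

# Least squares: solve the normal equations (A^T A) x = A^T b.
ata = [[sum(A[i][r] * A[i][c] for i in range(3)) for c in range(2)] for r in range(2)]
atb = [sum(A[i][r] * b[i] for i in range(3)) for r in range(2)]
det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
x = [(atb[0] * ata[1][1] - ata[0][1] * atb[1]) / det,   # recovered light 0
     (ata[0][0] * atb[1] - atb[0] * ata[1][0]) / det]   # recovered light 1
```

With noiseless observations and a full-rank system, `x` recovers the true light intensities; with a real photograph the same solve yields the least-squares fit.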
The quantitative structure of exponential time
 Complexity Theory Retrospective II
, 1997
Perspectives on the Theory and Practice of Belief Functions
 International Journal of Approximate Reasoning
, 1990
Abstract

Cited by 91 (7 self)
The theory of belief functions provides one way to use mathematical probability in subjective judgment. It is a generalization of the Bayesian theory of subjective probability. When we use the Bayesian theory to quantify judgments about a question, we must assign probabilities to the possible answers to that question. The theory of belief functions is more flexible; it allows us to derive degrees of belief for a question from probabilities for a related question. These degrees of belief may or may not have the mathematical properties of probabilities; how much they differ from probabilities will depend on how closely the two questions are related. Examples of what we would now call belief-function reasoning can be found in the late seventeenth and early eighteenth centuries, well before Bayesian ideas were developed. In 1689, George Hooper gave rules for combining testimony that can be recognized as special cases of Dempster's rule for combining belief functions (Shafer 1986a). Similar rules were formulated by Jakob Bernoulli in his Ars Conjectandi, published posthumously in 1713, and by Johann Heinrich Lambert in his Neues Organon, published in 1764 (Shafer 1978). Examples of belief-function reasoning can also be found in more recent work, by authors ...
Symbolic model checking for probabilistic processes
 In Proceedings of ICALP '97
, 1997
Abstract

Cited by 85 (30 self)
We introduce a symbolic model checking procedure for Probabilistic Computation Tree Logic PCTL over labelled Markov chains as models. Model checking for probabilistic logics typically involves solving linear equation systems in order to ascertain the probability of a given formula holding in a state. Our algorithm is based on the idea of representing the matrices used in the linear equation systems by Multi-Terminal Binary Decision Diagrams (MTBDDs) introduced in Clarke et al. [14]. Our procedure, based on the algorithm used by Hansson and Jonsson [24], uses BDDs to represent formulas and MTBDDs to represent Markov chains, and is efficient because it avoids explicit state space construction. A PCTL model checker is being implemented in Verus [9].
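The linear-equation view mentioned above can be illustrated on an explicit toy chain: the probability of eventually reaching a target state satisfies a linear system over the states, here solved by plain fixed-point iteration rather than the MTBDD representation the paper uses (the chain and state names are hypothetical):

```python
# Toy labelled Markov chain: P[s] maps each successor of s to its probability.
P = {
    "s0":   {"s1": 0.5, "s2": 0.5},
    "s1":   {"goal": 0.9, "fail": 0.1},
    "s2":   {"goal": 0.2, "fail": 0.8},
    "goal": {"goal": 1.0},   # absorbing target state
    "fail": {"fail": 1.0},   # absorbing failure state
}

# Probability of eventually reaching "goal" is the fixed point of the linear
# system  x[s] = sum_t P(s, t) * x[t],  with x[goal] = 1 and x[fail] = 0.
x = {s: 1.0 if s == "goal" else 0.0 for s in P}
for _ in range(100):  # iterate the system to (near) convergence
    x = {s: x[s] if s in ("goal", "fail")
         else sum(p * x[t] for t, p in P[s].items())
         for s in P}
```

Here `x["s0"]` converges to 0.5·0.9 + 0.5·0.2 = 0.55, the probability that a formula like "eventually goal" holds in s0.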