Results 1–10 of 58,991

Risk as Feelings, 2001
Cited by 501 (21 self)
"Virtually all current theories of choice under risk or uncertainty are cognitive and consequentialist. They assume that people assess the desirability and likelihood of possible outcomes of choice alternatives and integrate this information through some type of expectation-based calculus to arrive ..."

Risk and protective factors for alcohol and other drug problems in adolescence and early adulthood: Implications for substance abuse prevention
Psychological Bulletin, 1992
Cited by 725 (18 self)
"The authors suggest that the most promising route to effective strategies for the prevention of adolescent alcohol and other drug problems is through a risk-focused approach. This approach requires the identification of risk factors for drug abuse, identification of methods by which risk factors have been effectively addressed, and application of these methods to appropriate high-risk and general population samples in controlled studies. The authors review risk and protective factors for drug abuse, assess a number of approaches for drug abuse prevention potential with high-risk groups ..."

Genomic control for association studies, 1999
Cited by 480 (13 self)
"A dense set of single nucleotide polymorphisms (SNP) covering the genome and an efficient method to assess SNP genotypes are expected to be available in the near future. An outstanding question is how to use these technologies efficiently to identify genes affecting liability to complex disorders."

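As a pointer to what genomic control does in practice: the method deflates association test statistics by an inflation factor estimated from the genome-wide median. The sketch below is a minimal, illustrative Python version (0.4549 is the median of a 1-df chi-square distribution; the clamp at 1 is a common convention, not part of the original derivation), not the paper's full treatment.

```python
import statistics

def genomic_control(chi2_stats):
    """Deflate per-SNP 1-df chi-square statistics by the genomic inflation
    factor lambda = median(chi2) / 0.4549, where 0.4549 is the median of a
    chi-square distribution with one degree of freedom."""
    # Clamp at 1 so statistics are never inflated (a common convention).
    lam = max(1.0, statistics.median(chi2_stats) / 0.4549)
    return [s / lam for s in chi2_stats]

# Illustrative statistics: median 0.9098 gives lambda = 2.0, so every
# statistic is halved.
corrected = genomic_control([0.2, 0.9098, 3.1])
```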
Power provisioning for a warehouse-sized computer
ACM SIGARCH Computer Architecture News, 2007
Cited by 450 (2 self)
"Large-scale Internet services require a computing infrastructure that can be appropriately described as a warehouse-sized computing system. The cost of building datacenter facilities capable of delivering a given power capacity to such a computer can rival the recurring energy consumption ..."
"... collections of servers (up to 15 thousand) for different classes of applications over a period of approximately six months. Those observations allow us to evaluate opportunities for maximizing the use of the deployed power capacity of datacenters, and assess the risks of oversubscribing it. We find that even ..."

Robust Distributed Network Localization with Noisy Range Measurements, 2004
Cited by 403 (20 self)
"This paper describes a distributed, linear-time algorithm for localizing sensor network nodes in the presence of range measurement noise and demonstrates the algorithm on a physical network. We introduce the probabilistic notion of robust quadrilaterals as a way to avoid flip ambiguities that otherwise corrupt localization computations. We formulate the localization problem as a two-dimensional graph realization problem: given a planar graph with approximately known edge lengths, recover the Euclidean position of each vertex up to a global rotation and translation. This formulation is applicable ..."

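The graph-realization formulation above can be illustrated on the smallest useful case: one unknown node with ranges to known anchors, recovered by least squares. This sketch uses plain gradient descent and hypothetical anchor coordinates; it is not the paper's robust-quadrilateral algorithm.

```python
import math

def localize(anchors, dists, x0=(0.0, 0.0), steps=5000, lr=0.01):
    """Least-squares 2-D position of one node from range measurements
    to anchors at known positions: minimize sum_i (||x - a_i|| - d_i)^2
    by gradient descent."""
    x, y = x0
    for _ in range(steps):
        gx = gy = 0.0
        for (ax, ay), d in zip(anchors, dists):
            r = math.hypot(x - ax, y - ay) or 1e-12  # avoid divide-by-zero
            err = r - d                               # residual of this range
            gx += 2 * err * (x - ax) / r              # d(err^2)/dx
            gy += 2 * err * (y - ay) / r              # d(err^2)/dy
        x -= lr * gx
        y -= lr * gy
    return x, y

# Hypothetical anchors and a node at (2, 1); ranges are noise-free here,
# so the least-squares solution should recover the true position.
anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 3.0)]
true_pos = (2.0, 1.0)
dists = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) for ax, ay in anchors]
est = localize(anchors, dists, x0=(1.0, 1.0))
```

Three non-collinear anchors make the minimum unique, which sidesteps the flip ambiguities the paper is concerned with; with fewer anchors or collinear ones, reflections of the solution fit the ranges equally well.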
The minimum description length principle in coding and modeling
IEEE Transactions on Information Theory, 1998
Cited by 394 (18 self)
"We review the principles of Minimum Description Length and Stochastic Complexity as used in data compression and statistical modeling. Stochastic complexity is formulated as the solution to optimum universal coding problems extending Shannon’s basic source coding theorem. The normalized maximized ..."

Measuring security price performance
Journal of Financial Economics, 1980
Cited by 379 (2 self)
"Event studies focus on the impact of particular types of firm-specific events on the prices of the affected firms’ securities. In this paper, observed stock return data are employed to examine various methodologies which are used in event studies to measure security price performance. Abnormal performance is introduced into this data. We find that a simple methodology based on the market model performs well under a wide variety of conditions. In some situations, even simpler methods which do not explicitly adjust for market-wide factors or for risk perform no worse than the market model. We also ..."

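The market-model methodology the snippet refers to is simple to state: regress the security's return on the market return over an estimation window, then call the deviation from that benchmark the abnormal return. A minimal sketch with made-up return series:

```python
def ols(x, y):
    """Estimate market-model parameters (alpha, beta) by ordinary least squares."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return my - beta * mx, beta

def abnormal_returns(stock, market, alpha, beta):
    """Abnormal return = actual return minus the market-model
    benchmark alpha + beta * R_market."""
    return [rs - (alpha + beta * rm) for rs, rm in zip(stock, market)]

# Made-up estimation window in which the stock moves exactly as
# 0.001 + 1.2 * market, so the abnormal returns should all be ~0.
market = [0.010, -0.020, 0.030, 0.005, -0.015]
stock = [0.001 + 1.2 * rm for rm in market]
alpha, beta = ols(market, stock)
ar = abnormal_returns(stock, market, alpha, beta)
```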
Posterior Predictive Assessment of Model Fitness Via Realized Discrepancies
Statistica Sinica, 1996
Cited by 348 (39 self)
"This paper considers Bayesian counterparts of the classical tests for goodness of fit and their use in judging the fit of a single Bayesian model to the observed data. We focus on posterior predictive assessment, in a framework that also includes conditioning on auxiliary statistics. The Bayesian formulation facilitates the construction and calculation of a meaningful reference distribution not only for any (classical) statistic, but also for any parameter-dependent “statistic” or discrepancy. The latter allows us to propose the realized discrepancy assessment of model fitness, which ..."

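A posterior predictive check of the kind described can be sketched in a few lines for a toy model: data assumed i.i.d. N(theta, 1) with a flat prior, and discrepancy T(y) = max(y). The model choice, seed, and data here are illustrative, not from the paper.

```python
import random

def ppc_pvalue(y, draws=2000, seed=0):
    """Posterior predictive p-value for T(y) = max(y), assuming
    y_i ~ N(theta, 1) with a flat prior on theta, under which the
    posterior of theta is N(mean(y), 1/n)."""
    rng = random.Random(seed)
    n = len(y)
    ybar = sum(y) / n
    t_obs = max(y)
    hits = 0
    for _ in range(draws):
        theta = rng.gauss(ybar, 1 / n ** 0.5)        # posterior draw of theta
        yrep = [rng.gauss(theta, 1) for _ in y]      # replicated dataset
        hits += max(yrep) >= t_obs                   # T(yrep) vs. T(y)
    return hits / draws

# Toy data actually drawn from N(0, 1), so the model is well specified
# and the p-value should not be extreme.
data_rng = random.Random(42)
y = [data_rng.gauss(0, 1) for _ in range(20)]
p = ppc_pvalue(y)
```

A p-value near 0 or 1 would flag the discrepancy as poorly reproduced by the fitted model; the paper's realized-discrepancy idea extends this to parameter-dependent T(y, theta).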
Support Vector Machines for Classification and Regression
University of Southampton, Technical Report, 1998
Cited by 357 (5 self)
"The problem of empirical data modelling is germane to many engineering applications. In empirical data modelling a process of induction is used to build up a model of the system, from which it is hoped to deduce responses of the system that have yet to be observed. Ultimately the quantity and qualit ..."
"... for parameter selection and the statistical measures used to select the ‘best’ model. The foundations of Support Vector Machines (SVM) have been developed by Vapnik (1995) and are gaining popularity due to many attractive features, and promising empirical performance. The formulation embodies the Structural ..."

Curvature of the probability weighting function
Management Science, 1996
Cited by 290 (5 self)
"Empirical studies have shown that decision makers do not usually treat probabilities linearly. Instead, people tend to overweight small probabilities and underweight large probabilities. One way to model such distortions in decision making under risk is through a probability weighting function. We ..."

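One common one-parameter weighting function with exactly this inverse-S shape is the Tversky and Kahneman (1992) form; the parameter value below is illustrative, not this paper's estimate.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) one-parameter probability weighting function:
    w(p) = p^gamma / (p^gamma + (1-p)^gamma)^(1/gamma).
    For gamma < 1 it overweights small p and underweights large p."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Inverse-S shape: a 1% chance is weighted as noticeably more than 1%,
# while a 90% chance is weighted as less than 90%.
small = tk_weight(0.01)
large = tk_weight(0.90)
```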