Results 1–10 of 1,074,885
Expectations and the Neutrality of Money
Journal of Economic Theory, 1972
Cited by 858 (5 self)
Abstract:
This paper provides a simple example of an economy in which equilibrium prices and quantities exhibit what may be the central feature of the modern business cycle: a systematic relation between the rate of change in nominal prices and the level of real output. The relationship, ...
Liquidity Risk and Expected Stock Returns
2002
Cited by 590 (4 self)
Abstract:
This study investigates whether marketwide liquidity is a state variable important for asset pricing. We find that expected stock returns are related cross-sectionally to the sensitivities of returns to fluctuations in aggregate liquidity. Our monthly liquidity measure, an average of individual ...
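The two-pass procedure the abstract alludes to (time-series regressions to estimate each stock's sensitivity to aggregate liquidity, then a cross-sectional regression of average returns on those sensitivities) can be sketched on synthetic data. Everything below, including the data-generating process and the premium value, is invented for illustration and is not the paper's actual liquidity measure:

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 240, 50                      # months, stocks (synthetic)

# Synthetic aggregate-liquidity innovations, and stock returns whose
# exposure to liquidity (beta_L) varies across stocks.
liq = rng.normal(size=T)            # market-wide liquidity innovations
beta_L = np.linspace(-1.0, 1.0, N)  # true liquidity sensitivities
prem = 0.4                          # assumed liquidity risk premium
returns = (prem * beta_L + np.outer(liq, beta_L)
           + rng.normal(scale=2.0, size=(T, N)))

# Pass 1: time-series regression per stock to estimate its liquidity beta.
X = np.column_stack([np.ones(T), liq])
betas = np.linalg.lstsq(X, returns, rcond=None)[0][1]   # slope on liquidity

# Pass 2: cross-sectional regression of average returns on estimated betas.
Xc = np.column_stack([np.ones(N), betas])
gamma = np.linalg.lstsq(Xc, returns.mean(axis=0), rcond=None)[0]
print(f"estimated liquidity premium: {gamma[1]:.2f}")   # close to prem
```

The cross-sectional slope recovers (approximately) the premium built into the synthetic returns; the real study of course uses a carefully constructed liquidity measure rather than white noise.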
Evaluating the Accuracy of Sampling-Based Approaches to the Calculation of Posterior Moments
In Bayesian Statistics, 1992
Cited by 583 (14 self)
Abstract:
... accuracy of the approximations to the expected value of functions of interest under the posterior. In this paper, methods from spectral analysis are used to evaluate numerical accuracy formally and to construct diagnostics for convergence. These methods are illustrated in the normal linear model ...
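The convergence diagnostic this paper is known for compares the mean of an early segment of the chain with the mean of a late segment, using spectral-density estimates of the numerical variance. A rough sketch, with a crude truncated-autocovariance sum standing in for the paper's spectral-window estimators (the segment fractions and lag truncation are illustrative choices, not the paper's):

```python
import numpy as np

def spectral_var(x, max_lag=None):
    """Crude estimate of the spectral density at frequency zero
    (variance plus twice the summed autocovariances)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    if max_lag is None:
        max_lag = int(n ** 0.5)
    xc = x - x.mean()
    acov = [xc @ xc / n] + [(xc[:-k] @ xc[k:]) / n for k in range(1, max_lag)]
    return acov[0] + 2.0 * sum(acov[1:])

def geweke_z(chain, first=0.1, last=0.5):
    """Z-score comparing the mean of the first 10% of the chain
    with the mean of the last 50%."""
    n = len(chain)
    a, b = chain[: int(first * n)], chain[int((1 - last) * n):]
    num = a.mean() - b.mean()
    den = np.sqrt(spectral_var(a) / len(a) + spectral_var(b) / len(b))
    return num / den

rng = np.random.default_rng(1)
stationary = rng.normal(size=5000)   # a "chain" that has converged
z = geweke_z(stationary)
print(f"Geweke z-score: {z:.2f}")    # roughly standard normal if converged
```

For a converged chain the z-score behaves approximately like a standard normal draw; large |z| flags a drifting mean and hence non-convergence.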
Simulating Physics with Computers
International Journal of Theoretical Physics, 1982
Cited by 601 (1 self)
Abstract:
A digital computer is generally believed to be an efficient universal computing device; that is, it is believed able to simulate any physical computing device with an increase in computation time of at most a polynomial factor. This may not be true when quantum mechanics is taken into consideration ...
The Protection of Information in Computer Systems
1975
Cited by 815 (2 self)
Abstract:
This tutorial paper explores the mechanics of protecting computer-stored information from unauthorized use or modification. It concentrates on those architectural structures, whether hardware or software, that are necessary to support information protection. The paper develops in three main ...
Segmentation of brain MR images through a hidden Markov random field model and the expectation-maximization algorithm
IEEE Transactions on Medical Imaging, 2001
Cited by 619 (14 self)
Abstract:
The finite mixture (FM) model is the most commonly used model for statistical segmentation of brain magnetic resonance (MR) images because of its simple mathematical form and the piecewise constant nature of ideal brain MR images. However, being a histogram-based model, the FM has an intrinsic limitation: no spatial information is taken into account. This causes the FM model to work only on well-defined images with low levels of noise; unfortunately, this is often not the case due to artifacts such as partial volume effect and bias field distortion. Under these conditions, FM model-based methods produce unreliable results. In this paper, we propose a novel hidden Markov random field (HMRF) model, which is a stochastic process generated by an MRF whose state sequence cannot be observed directly but which can be indirectly estimated through observations. Mathematically, it can be shown that the FM model is a degenerate version of the HMRF model. The advantage of the HMRF model derives from the way in which the spatial information is encoded through the mutual influences of neighboring sites. Although MRF modeling has been employed in MR image segmentation by other researchers, most reported methods are limited to using MRF as a general prior in an FM model-based approach. To fit the HMRF model, an EM algorithm is used. We show that by incorporating both the HMRF model and the EM algorithm into an HMRF-EM framework, an accurate and robust segmentation can be achieved. More importantly, the HMRF-EM framework can easily be combined with other techniques. As an example, we show how the bias field correction algorithm of Guillemaud and Brady (1997) can be incorporated into this framework to achieve a three-dimensional fully automated approach for brain MR image segmentation.
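The baseline finite mixture model that the abstract calls a degenerate HMRF can be fit with a plain EM update. A minimal one-dimensional, two-component Gaussian mixture on synthetic "intensities" (this is the histogram-based FM model only; the paper's HMRF-EM additionally couples neighboring voxels, which this sketch omits):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic 1-D intensities: two tissue classes with different means.
x = np.concatenate([rng.normal(60, 8, 300), rng.normal(120, 10, 300)])

# EM for a two-component Gaussian mixture.
mu = np.array([50.0, 130.0])        # initial means
sd = np.array([20.0, 20.0])         # initial standard deviations
w = np.array([0.5, 0.5])            # initial mixing weights
for _ in range(50):
    # E-step: posterior class responsibilities for each intensity.
    dens = (w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2)
            / (sd * np.sqrt(2 * np.pi)))
    resp = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print(mu.round(1))   # recovers means near 60 and 120
```

Because each voxel is classified from its intensity alone, this FM fit is exactly the noise-sensitive baseline the paper improves on by adding MRF spatial smoothing inside the same EM loop.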
Reversible jump Markov chain Monte Carlo computation and Bayesian model determination
Biometrika, 1995
Cited by 1330 (24 self)
Abstract:
Markov chain Monte Carlo methods for Bayesian computation have until recently been restricted to problems where the joint distribution of all variables has a density with respect to some fixed standard underlying measure. They have therefore not been available for application to Bayesian model ...
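A minimal dimension-jumping example in the spirit of the paper: sampling across two models of different dimension, M0 (mean fixed at zero) and M1 (unknown mean with a standard normal prior). Proposing the new mean from its prior makes the prior and proposal densities cancel, and the Jacobian of the identity mapping is 1, so the acceptance probability reduces to a likelihood ratio. The data, priors, and tuning constants below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(1.0, 1.0, size=30)      # data with a nonzero mean

def loglik(mu):
    return -0.5 * np.sum((y - mu) ** 2)

k, mu = 0, 0.0                         # start in M0
samples = []
for _ in range(20000):
    if rng.random() < 0.5:             # between-model ("jump") move
        if k == 0:                     # birth: M0 -> M1, mu drawn from prior
            mu_new = rng.normal()
            if np.log(rng.random()) < loglik(mu_new) - loglik(0.0):
                k, mu = 1, mu_new
        else:                          # death: M1 -> M0, drop mu
            if np.log(rng.random()) < loglik(0.0) - loglik(mu):
                k, mu = 0, 0.0
    elif k == 1:                       # within-model random-walk update of mu
        mu_new = mu + 0.3 * rng.normal()
        if np.log(rng.random()) < (loglik(mu_new) - 0.5 * mu_new ** 2
                                   - loglik(mu) + 0.5 * mu ** 2):
            mu = mu_new
    samples.append((k, mu))

ks = np.array([s[0] for s in samples])
print(f"P(M1 | y) approx {ks.mean():.2f}")   # near 1 for this data
```

The fraction of iterations spent in each model estimates its posterior probability, which is the "Bayesian model determination" use of the sampler; real applications need carefully designed jump proposals rather than the trivial identity mapping used here.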
A Survey of Computer Vision-Based Human Motion Capture
Computer Vision and Image Understanding, 2001
Cited by 508 (14 self)
Abstract:
A comprehensive survey of computer vision-based human motion capture literature from the past two decades is presented. The focus is on a general overview based on a taxonomy of system functionalities, broken down into four processes: initialization, tracking, pose estimation, and recognition. Each ...
Nested Transactions: An Approach to Reliable Distributed Computing
1981
Cited by 527 (1 self)
Abstract:
Distributed computing systems are being built and used more and more frequently. This distributed computing revolution makes the reliability of distributed systems an important concern. It is fairly well understood how to connect hardware so that most components can continue to work when others ...
Decision-Theoretic Planning: Structural Assumptions and Computational Leverage
Journal of Artificial Intelligence Research, 1999
Cited by 510 (4 self)
Abstract:
Planning under uncertainty is a central problem in the study of automated sequential decision making, and has been addressed by researchers in many different fields, including AI planning, decision analysis, operations research, control theory, and economics. While the assumptions and perspectives adopted in these areas often differ in substantial ways, many planning problems of interest to researchers in these fields can be modeled as Markov decision processes (MDPs) and analyzed using the techniques of decision theory. This paper presents an overview and synthesis of MDP-related methods, showing how they provide a unifying framework for modeling many classes of planning problems studied in AI. It also describes structural properties of MDPs that, when exhibited by particular classes of problems, can be exploited in the construction of optimal or approximately optimal policies or plans. Planning problems commonly possess structure in the reward and value functions used to de...
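The MDP machinery the abstract refers to can be illustrated with value iteration on a toy problem; the transition probabilities, rewards, and discount factor below are arbitrary choices made for this sketch:

```python
import numpy as np

# A tiny MDP: 3 states, 2 actions.
# P[a, s, s'] gives transition probabilities; R[s, a] gives rewards.
P = np.array([
    [[0.8, 0.2, 0.0], [0.1, 0.8, 0.1], [0.0, 0.2, 0.8]],   # action 0
    [[0.5, 0.5, 0.0], [0.0, 0.5, 0.5], [0.0, 0.0, 1.0]],   # action 1
])
R = np.array([[0.0, 0.0], [0.0, 0.0], [1.0, 1.0]])          # state 2 pays off
gamma = 0.9

# Value iteration: repeatedly apply the Bellman optimality backup
# V(s) <- max_a [ R(s,a) + gamma * sum_s' P(s'|s,a) V(s') ].
V = np.zeros(3)
for _ in range(500):
    Q = R.T + gamma * P @ V          # Q[a, s]
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = Q.argmax(axis=0)            # greedy action per state
print(V.round(2), policy)
```

State 2 pays reward 1 and action 1 keeps the agent there, so its value converges to 1/(1 - gamma) = 10, and the greedy policy drives every state toward it; the structural properties discussed in the paper are about exploiting regularities in P and R so this backup scales beyond toy state spaces.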