Results 1–10 of 25
Global Optimization of Statistical Functions with Simulated Annealing
 Journal of Econometrics
, 1994
Abstract

Cited by 124 (1 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
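The idea the paper tests can be sketched in a few lines. The following is a minimal, generic simulated-annealing loop, not the authors' implementation; the multimodal test function and all tuning constants (initial temperature, cooling rate, step size) are illustrative:

```python
import math
import random

def simulated_annealing(f, x0, n_iter=20000, t0=10.0, cooling=0.999, step=1.0, seed=0):
    """Minimise f by simulated annealing.  Downhill moves are always
    accepted; uphill moves are accepted with probability exp(-delta/T),
    which lets the search climb out of local minima while T is high."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(n_iter):
        x_new = x + rng.uniform(-step, step)      # random neighbour
        f_new = f(x_new)
        delta = f_new - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = x_new, f_new                  # accept the move
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                              # geometric cooling schedule
    return best_x, best_f

# A deliberately multimodal test function: many local minima,
# global minimum at x = 0.
f = lambda x: x**2 + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
x_star, f_star = simulated_annealing(f, x0=7.0)
```

Because uphill moves are accepted with probability exp(-delta/T), the sampler can leave the basin of a local minimum while the temperature is high and then settles as T decays; a conventional gradient-based method started at x0 = 7 would typically stop at the nearest local minimum.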
Neural Networks and Statistical Models
, 1994
Abstract

Cited by 99 (1 self)
There has been much publicity about the ability of artificial neural networks to learn and generalize. In fact, the most commonly used artificial neural networks, called multilayer perceptrons, are nothing more than nonlinear regression and discriminant models that can be implemented with standard statistical software. This paper explains what neural networks are, translates neural network jargon into statistical jargon, and shows the relationships between neural networks and statistical models such as generalized linear models, maximum redundancy analysis, projection pursuit, and cluster analysis.
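The paper's central observation, that a multilayer perceptron is a nonlinear regression model, can be made concrete. The sketch below fits the regression function y = b + sum_j w_j * tanh(a_j * x + c_j) to noisy data by least squares with plain gradient descent; the data and hyperparameters are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)   # target: noisy sine

H = 8                                  # hidden units = nonlinear basis terms
A = rng.normal(size=(1, H))            # "input weights" = slopes a_j
c = np.zeros(H)                        # "hidden biases" = offsets c_j
w = 0.1 * rng.normal(size=H)           # "output weights" = regression coefficients
b = 0.0                                # "output bias" = intercept

lr = 0.05
for _ in range(2000):
    Z = np.tanh(X @ A + c)             # hidden activations, shape (200, H)
    pred = Z @ w + b
    err = pred - y                     # residuals
    # gradients of half the mean squared error
    g_w = Z.T @ err / len(y)
    g_b = err.mean()
    g_Z = np.outer(err, w) * (1.0 - Z**2)
    g_A = X.T @ g_Z / len(y)
    g_c = g_Z.mean(axis=0)
    w -= lr * g_w
    b -= lr * g_b
    A -= lr * g_A
    c -= lr * g_c

mse = np.mean((np.tanh(X @ A + c) @ w + b - y) ** 2)
```

Read this way, the "network" is ordinary nonlinear least-squares regression with tanh basis functions whose slopes and offsets are themselves estimated, which is the paper's point.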
Seer: Maximum Likelihood Regression for Learning-Speed Curves
 University of Illinois at
, 1995
Abstract

Cited by 10 (0 self)
The research presented here focuses on modeling machine-learning performance. The thesis introduces Seer, a system that generates empirical observations of classification-learning performance and then uses those observations to create statistical models. The models can be used to predict the number of training examples needed to achieve a desired accuracy level and the maximum accuracy possible given an unlimited number of training examples. Seer advances the state of the art with 1) models that embody the best constraints for classification learning and the most useful parameters, 2) algorithms that efficiently find maximum-likelihood models, and 3) a demonstration on real-world data from three domains of a practicable application of such modeling. The first part of the thesis gives an overview of the requirements for a good maximum-likelihood model of classification-learning performance. Next, reasonable design choices for such models are explored. Selection among such models is a task of nonlinear programming, but by exploiting appropriate problem constraints, the task is reduced to a nonlinear regression task that can be solved with an efficient iterative algorithm. The latter part of the thesis describes almost 100 experiments in the domains of soybean disease, heart disease, and audiological problems. The tests show that Seer is excellent at characterizing learning performance and that it seems to be as good as possible at predicting learning ...
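Seer's exact parametric family is not given in the abstract; a common stand-in for this kind of learning-speed curve is the inverse power law acc(n) = a - b * n**(-c), where a is the asymptotic accuracy given unlimited training examples. Since the model is linear in (a, b) for fixed c, the exponent can be profiled out by grid search and the rest solved by ordinary least squares; the accuracy numbers below are made up for illustration:

```python
import numpy as np

# Hypothetical learning-curve data: accuracy after n training examples.
n = np.array([10, 20, 50, 100, 200, 500, 1000], dtype=float)
acc = np.array([0.61, 0.68, 0.75, 0.79, 0.82, 0.845, 0.855])

best = None
for c in np.linspace(0.05, 2.0, 400):           # profile out the exponent c
    # For fixed c the model acc = a - b * n**(-c) is linear in (a, b).
    F = np.column_stack([np.ones_like(n), -n**(-c)])
    (a, b), *_ = np.linalg.lstsq(F, acc, rcond=None)
    sse = np.sum((F @ [a, b] - acc) ** 2)
    if best is None or sse < best[0]:
        best = (sse, a, b, c)

sse, a, b, c = best
asymptote = a           # predicted maximum accuracy with unlimited examples
```

This is the flavour of prediction the abstract describes: the fitted curve answers both "how many examples to reach a target accuracy" (invert the curve) and "what is the accuracy ceiling" (the parameter a).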
Co-Volatility and Correlation Clustering: A Multivariate Correlated ARCH Framework
, 2001
Abstract

Cited by 3 (0 self)
We present a new, full multivariate framework for modelling the evolution of conditional correlation between financial asset returns. Our approach assumes that a vector of asset returns is shocked by a vector innovation process whose covariance matrix is time-dependent. We then employ an appropriate Cholesky decomposition of the asset covariance matrix which, when transformed using a spherical decomposition, allows for the modelling of conditional variances and correlations. The resulting asset covariance matrix is guaranteed to be positive definite at each point in time. We follow Christodoulakis and Satchell (2001) in designing conditionally autoregressive stochastic processes for the correlation coefficients and present analytical results for their distributional properties. Our approach allows for explicit out-of-sample forecasting of conditional correlations and generates a number of observed stylised facts such as time-varying correlations, persistence and correlation clustering, co-movement between correlation coefficients, between correlation and volatility, as well as between volatility processes (co-volatility). We also study analytically the co-movement between the elements of the asset covariance matrix, which are shown to depend on their persistence parameters. We provide empirical evidence on a trivariate model using monthly data from Dow Jones Industrial, ...
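The positive-definiteness guarantee follows from the Cholesky construction alone: any lower-triangular factor with a strictly positive diagonal yields a positive-definite covariance, whatever dynamics drive the factor's entries over time. A small numerical check of that property (illustrative numbers; the paper's spherical parameterisation of the factor is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)
k = 3                                              # trivariate, as in the paper

# An arbitrary lower-triangular factor; in the model its entries would
# follow conditionally autoregressive dynamics.
L = np.tril(rng.normal(size=(k, k)))
np.fill_diagonal(L, np.abs(np.diag(L)) + 0.1)      # enforce positive diagonal

Sigma = L @ L.T                                    # conditional covariance matrix
eigvals = np.linalg.eigvalsh(Sigma)                # all > 0 by construction

# Implied conditional correlations from the covariance.
d = np.sqrt(np.diag(Sigma))
R = Sigma / np.outer(d, d)
```

Whatever values the dynamics assign to L, Sigma = L L' has positive eigenvalues and the implied correlation matrix R has unit diagonal and entries in [-1, 1], which is the modelling convenience the abstract highlights.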
Asymmetric Information and Survival in Financial Markets
, 1999
Abstract

Cited by 2 (1 self)
In the evolutionary setting for a financial market developed by Blume and Easley (1992), we consider an infinitely repeated version of a model à la Grossman and Stiglitz (1980) with asymmetrically informed traders. Informed traders observe the realisation of a payoff-relevant signal before making their portfolio decisions. Uninformed traders do not have direct access to this kind of information, but can partially infer it from market prices. As a counterpart for their privileged information, informed traders pay a per-period cost. As a result, information acquisition triggers a trade-off in our setting. We prove that, as long as information is costly, a strictly positive measure of uninformed traders will survive. This result contributes to the literature on noise trading. It suggests that Friedman's (1953) argument against the importance of noise traders in the process of price determination is too simplistic. Traders whose beliefs are wrong according to the best available information, in fact, are not wiped out by market forces and do affect asset prices in the long run.
ASSESSING THE VALUE OF DELAY TO TRUCKERS AND CARRIERS
, 2011
"... UTCM: DTRT06-G-0044 CFIRE: DTRT06-G-0020 ..."
Sourcing Decisions for Air Force Support Services
Abstract
This documented briefing reports the results to date of one line of inquiry in that project: Where should the Air Force look within its support activities for additional outsourcing candidates? This briefing addresses a group of activities that the Air Force refers to as "commercial activities." By definition, these activities are available in the private sector. That does not mean that the private sector can necessarily provide them more cost-effectively than the Air Force can. But the Air Force buys many of these services from private sources already. Private sources should be available if the Air Force decides to buy additional services of this kind. This briefing focuses on two issues. First, of all the current activities that the Air Force has identified as commercial, how many has it already outsourced? How do outsourcing patterns depend on the major command and on activity type? When the effects of major commands and activity types are accounted for, how much cross-installation variation remains? Second, the Office of Management and Budget requires the Air Force to use an "A-76 program" to compare organic and contract costs for most commercial activities other than depot maintenance. (Depot maintenance is excluded from A-76 cost comparison by Title 10, Section 2469 of the U.S. Code.) How has this program worked in different parts of the Air Force and for different kinds of activities? What can we infer from that about how the A-76 program would affect additional outsourcing in the future? This briefing should be of interest to managers and analysts concerned with support matters in the Air Force, especially those involved in outsourcing and privatization, and to support services managers and contracting officials in the other military departments and in the Offic...
Ox Workshop
Abstract
This document was prepared for the Ox course at the 9th ESTE (Escola de Series Temporais e Econometria) in Belo Horizonte, August 7–10, 2001. It draws heavily on material developed with various co-authors, in particular Siem Jan Koopman, Marius Ooms, and Neil Shephard. The relevant references are Doornik and Ooms (2001), Doornik and Ooms (1999), and Koopman, Shephard and Doornik (1999). The Ox book (Doornik, 2001) contains the Ox reference documentation, a large part of which is also available in the online help. © 2001 J.A. Doornik