Results 1 – 10 of 47
Global Optimization of Statistical Functions with Simulated Annealing
 Journal of Econometrics
, 1994
Abstract

Cited by 281 (2 self)
Many statistical methods rely on numerical optimization to estimate a model’s parameters. Unfortunately, conventional algorithms sometimes fail. Even when they do converge, there is no assurance that they have found the global, rather than a local, optimum. We test a new optimization algorithm, simulated annealing, on four econometric problems and compare it to three common conventional algorithms. Not only can simulated annealing find the global optimum, it is also less likely to fail on difficult functions because it is a very robust algorithm. The promise of simulated annealing is demonstrated on the four econometric problems.
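The acceptance rule at the heart of simulated annealing, always take downhill moves and take uphill moves with probability exp(-Δ/T) under a falling temperature T, can be sketched in a few lines. This is an illustrative minimal version, not the authors' implementation; the multimodal test function and all tuning constants are hypothetical:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, cooling=0.999, steps=10000, scale=1.0, seed=0):
    """Minimize f by simulated annealing: always accept improvements,
    accept uphill moves with probability exp(-delta/T), so the search
    can escape local optima while the temperature T cools."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        cand = x + rng.gauss(0.0, scale)          # random neighbour
        fc = f(cand)
        delta = fc - fx
        if delta < 0 or rng.random() < math.exp(-delta / t):
            x, fx = cand, fc                      # accept the move
        if fx < best_f:
            best_x, best_f = x, fx                # track the best point seen
        t *= cooling                              # geometric cooling schedule
    return best_x, best_f

# A multimodal test function (hypothetical): many local minima, global minimum at x = 0.
f = lambda x: x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))
x_star, f_star = simulated_annealing(f, x0=4.3)
```

A conventional hill-climber started at x0 = 4.3 would stop in the nearest local basin; the annealer's occasional uphill acceptances let it work its way toward the global optimum.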
Neural Networks and Statistical Models
, 1994
Abstract

Cited by 137 (1 self)
There has been much publicity about the ability of artificial neural networks to learn and generalize. In fact, the most commonly used artificial neural networks, called multilayer perceptrons, are nothing more than nonlinear regression and discriminant models that can be implemented with standard statistical software. This paper explains what neural networks are, translates neural network jargon into statistical jargon, and shows the relationships between neural networks and statistical models such as generalized linear models, maximum redundancy analysis, projection pursuit, and cluster analysis.
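The abstract's central point, that a multilayer perceptron is nonlinear regression, can be made concrete: fit a one-hidden-layer network to data by ordinary nonlinear least squares. This is a sketch on simulated data; SciPy's generic least-squares routine stands in for the "standard statistical software" the abstract mentions, and the target function, seed, and number of hidden units are assumptions:

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 80)
y = np.tanh(1.5 * x) + 0.05 * rng.standard_normal(x.size)   # noisy nonlinear target

H = 4  # hidden units

def mlp(params, x):
    # A one-hidden-layer perceptron is just the nonlinear regression
    # function f(x) = c + sum_j v_j * tanh(w_j * x + b_j).
    w, b = params[:H], params[H:2 * H]
    v, c = params[2 * H:3 * H], params[3 * H]
    return np.tanh(np.outer(x, w) + b) @ v + c

def residuals(params):
    return mlp(params, x) - y

# "Training the network" is nothing more than nonlinear least squares.
fit = least_squares(residuals, x0=0.5 * rng.standard_normal(3 * H + 1))
mse = np.mean(residuals(fit.x) ** 2)
```

There is no backpropagation mystique here: the network weights are simply the parameters of a nonlinear regression, estimated by the same machinery a statistician would use for any other nonlinear model.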
The validity of environmental benefits transfer: further empirical testing. Environmental Economics and Management
, 1999
Abstract

Cited by 55 (3 self)
Abstract. This paper provides further empirical evidence of the validity of environmental benefits transfer based on CV studies by expanding the analysis to include control factors which have not been accounted for in previous studies. These factors refer to differences in respondent attitudes. Traditional population characteristics were taken into account, but these variables do not explain why respondents from the same socioeconomic group may still hold different beliefs, norms or values and hence have different attitudes and consequently state different WTP amounts. The test results are mixed. The function transfer approach is valid in one case, but is rejected in the three other cases investigated in this paper. We provide further evidence that in the case of statistically valid benefits transfer, the function approach results in a more robust benefits transfer than the unit value approach. We also show that the equality of coefficient estimates is a necessary, but not sufficient, condition for valid benefit function transfer and discuss the implications for previous and future validity testing.
An endogenous segmentation mode choice model with an application to intercity travel
 Transportation Science
, 1997
Abstract

Cited by 31 (5 self)
This paper uses an endogenous segmentation approach to model mode choice. This approach jointly determines the number of market segments in the travel population, assigns individuals probabilistically to each segment, and develops a distinct mode choice model for each segment group. The author proposes a stable and effective hybrid estimation approach for the endogenous segmentation model that combines an Expectation-Maximization (EM) algorithm with standard likelihood maximization routines. If access to general maximum-likelihood software is not available, the multinomial-logit-based EM algorithm can be used in isolation. The endogenous segmentation model and other commonly used models in the travel demand field to capture systematic heterogeneity are estimated using a Canadian intercity mode choice dataset. The results show that the endogenous segmentation model fits the data best and provides intuitively more reasonable results compared to the other approaches.
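The EM-based estimation idea can be illustrated with a stripped-down latent-class binary logit fitted on simulated data. This is only a sketch of the E-step/M-step alternation: the paper's model is multinomial and much richer, and every parameter value below is hypothetical:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def weighted_logit(X, y, w, iters=25):
    """Weighted binary-logit MLE via Newton-Raphson (the M-step fit)."""
    beta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ beta)
        hess = X.T @ ((w * p * (1.0 - p))[:, None] * X) + 1e-8 * np.eye(X.shape[1])
        beta = beta + np.linalg.solve(hess, X.T @ (w * (y - p)))
    return beta

def em_latent_class_logit(X, y, K=2, iters=40, seed=0):
    """EM for a K-segment binary logit: the E-step assigns each individual
    a posterior probability of belonging to each segment; the M-step
    refits the segment shares and a weighted logit per segment."""
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)
    betas = 0.1 * rng.standard_normal((K, X.shape[1]))
    ll_path = []
    for _ in range(iters):
        # E-step: posterior "responsibility" of each segment for each person.
        p = np.clip(sigmoid(X @ betas.T), 1e-12, 1.0 - 1e-12)   # n x K choice probs
        lik = np.where(y[:, None] == 1, p, 1.0 - p) * pi
        r = lik / lik.sum(axis=1, keepdims=True)
        ll_path.append(np.log(lik.sum(axis=1)).sum())
        # M-step: update segment shares and segment-specific coefficients.
        pi = r.mean(axis=0)
        betas = np.vstack([weighted_logit(X, y, r[:, k]) for k in range(K)])
    return pi, betas, ll_path

# Simulated two-segment data (hypothetical, for illustration only).
rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
true_betas = np.array([[1.0, 2.0], [-1.0, -2.0]])
z = (rng.random(n) < 0.5).astype(int)                  # latent segment membership
y = (rng.random(n) < sigmoid(np.sum(X * true_betas[z], axis=1))).astype(float)

pi, betas, ll_path = em_latent_class_logit(X, y)
```

Because each M-step maximizes a concave weighted log-likelihood exactly, the overall log-likelihood is non-decreasing across EM iterations, which is the stability property that makes the hybrid approach attractive.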
Co-Volatility and Correlation Clustering: A Multivariate Correlated ARCH Framework
, 2001
Abstract

Cited by 19 (0 self)
We present a new, full multivariate framework for modelling the evolution of conditional correlation between financial asset returns. Our approach assumes that a vector of asset returns is shocked by a vector innovation process the covariance matrix of which is time-dependent. We then employ an appropriate Cholesky decomposition of the asset covariance matrix which, when transformed using a Spherical decomposition, allows for the modelling of conditional variances and correlations. The resulting asset covariance matrix is guaranteed to be positive definite at each point in time. We follow Christodoulakis and Satchell (2001) in designing conditionally autoregressive stochastic processes for the correlation coefficients and present analytical results for their distribution properties. Our approach allows for explicit out-of-sample forecasting of conditional correlations and generates a number of observed stylised facts such as time-varying correlations, persistence and correlation clustering, co-movement between correlation coefficients, correlation and volatility as well as between volatility processes (co-volatility). We also study analytically the co-movement between the elements of the asset covariance matrix which are shown to depend on their persistence parameters. We provide empirical evidence on a trivariate model using monthly data from Dow Jones Industrial,
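The positive-definiteness guarantee can be illustrated in isolation: parameterise the Cholesky factor of a correlation matrix by spherical angles, so that every angle vector maps to a valid correlation matrix by construction. This is a minimal sketch of the decomposition idea only, not the authors' full ARCH model; the example angles are arbitrary:

```python
import numpy as np

def correlation_from_angles(theta):
    """Build an n x n correlation matrix from n(n-1)/2 angles in (0, pi)
    via a spherical parameterisation of its Cholesky factor L.  Each row
    of L has unit Euclidean norm, so R = L @ L.T has ones on the diagonal
    and is positive definite whenever all the sines are nonzero."""
    n = int((1 + np.sqrt(1 + 8 * len(theta))) / 2)   # recover n from the angle count
    L = np.zeros((n, n))
    L[0, 0] = 1.0
    idx = 0
    for i in range(1, n):
        prod = 1.0
        for j in range(i):
            L[i, j] = np.cos(theta[idx]) * prod      # spherical coordinates of row i
            prod *= np.sin(theta[idx])
            idx += 1
        L[i, i] = prod                               # remaining mass on the diagonal
    return L @ L.T

R = correlation_from_angles(np.array([0.4, 1.1, 2.0]))  # a 3 x 3 example
```

Modelling the angles (or the correlations they induce) as autoregressive processes then yields a time-varying covariance matrix that is positive definite at every point in time, with no constraint checking needed during estimation.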
A Model of Price Determination for Fresh Produce with Application to California Iceberg Lettuce
 American Journal of Agricultural Economics
Abstract

Cited by 17 (4 self)
Most fresh produce commodities are highly perishable. Thus, supply at any time is fixed at prices above marginal harvest costs. In this paper we develop a model of short-run farm price determination for produce commodities that incorporates this key structural feature and also allows for imperfect competition in price determination. The model is empirically manifest in a switching regression framework and applied to California iceberg lettuce. Results support the general model relative to a model restricted to competitive behavior and suggest that retailer/buyers capture most of the rents from lettuce production and sale. Key words: iceberg lettuce, imperfect competition, price determination, switching regression. In this paper we address the issue of short-run price determination for produce commodities in markets that may be imperfectly competitive. We develop a new model of farm price determination and apply it to the California iceberg lettuce industry. Because produce is highly perishable and unstoreable, the potential supply at any point in time is determined by the available harvest, which itself is the consequence of planting decisions made months (in the case of vegetables or strawberries) or years (in the case of stone fruits) in advance. The short-run supply curve is, thus, perfectly inelastic over a range of prices. However, the marginal cost of harvesting establishes a lower bound on the market price because no product will be offered at prices below harvest costs. This constraint on pricing appears to bind with surprising frequency as figure 1 illustrates for lettuce, where roughly one-third of the weekly observations for 1988–92 are at or near the harvest cost floor of approximately $3.25 per carton. The inelastic short-run supply relationship precludes application of traditional industrial
Richard J. Sexton is professor and chair, and a member of the
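The censoring mechanism the abstract describes, a demand-driven latent price truncated below at the harvest-cost floor, can be simulated in a few lines. The $3.25 floor is taken from the abstract; the latent-price distribution and sample length are hypothetical, chosen only to show how a mass of observations piles up at the floor:

```python
import numpy as np

rng = np.random.default_rng(2)
weeks = 260                               # five years of weekly observations
harvest_cost = 3.25                       # per-carton harvest-cost floor (from the abstract)

# Hypothetical latent demand-driven price: with supply fixed each week,
# the price swings widely with demand shocks.
latent = 4.0 + 2.5 * rng.standard_normal(weeks)

# No product is offered below harvest cost, so the observed price is censored.
price = np.maximum(latent, harvest_cost)
share_at_floor = np.mean(price <= harvest_cost + 1e-9)
```

The resulting pile-up of observations at exactly the floor is what motivates a switching-regression treatment: one regime where price is set by demand, and one where the harvest-cost constraint binds.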
Calculating errors for measures derived from choice modelling estimates, paper presented at the 88th Annual Meeting of the Transportation Research Board
, 2009
Abstract

Cited by 14 (6 self)
The calibration of choice models produces a set of parameter estimates and an associated covariance matrix, usually based on maximum likelihood estimation. However, in many cases, the values of interest to analysts are in fact functions of these parameters rather than the parameters themselves. It is thus also crucial to have a measure of variance for these derived quantities and it is preferable that this can be guaranteed to have the maximum likelihood properties, such as minimum variance. While the calculation of standard errors using the Delta method has been described for a number of such measures in the literature, including the ratio of two parameters, these results are often seen to be approximate calculations and do not claim maximum likelihood properties. In this paper, we show that many measures commonly used in transport studies and elsewhere are themselves maximum likelihood estimates and that the standard errors are thus exact, a point we illustrate for a substantial number of commonly used functions. We also discuss less appropriate methods, notably highlighting the issues with using simulation for obtaining the variance of a function of estimates.
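For the most common derived quantity, the ratio of two parameters, the Delta-method standard error is a short computation: the variance of g(b) is approximated by grad(g)' V grad(g), where V is the estimated covariance matrix. The numbers below are hypothetical, loosely styled on a value-of-time ratio (time coefficient over cost coefficient) from a choice model:

```python
import numpy as np

def delta_method_ratio(beta, cov, i, j):
    """Standard error of the ratio beta[i] / beta[j] by the Delta method:
    for g(b) = b_i / b_j, grad has 1/b_j in slot i and -b_i/b_j^2 in
    slot j, and var(g) is approximately grad' V grad."""
    grad = np.zeros(len(beta))
    grad[i] = 1.0 / beta[j]
    grad[j] = -beta[i] / beta[j] ** 2
    return np.sqrt(grad @ cov @ grad)

# Hypothetical estimates: time and cost coefficients with their covariance.
beta = np.array([-0.045, -0.012])
cov = np.array([[4.0e-5, 1.0e-6],
                [1.0e-6, 9.0e-6]])

ratio = beta[0] / beta[1]                    # the value-of-time estimate
se = delta_method_ratio(beta, cov, 0, 1)     # its Delta-method standard error
```

The paper's point is that when the ratio is itself a maximum-likelihood estimate (by invariance of MLE under transformation), this standard error inherits the maximum-likelihood properties rather than being merely an approximation.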
Seer: Maximum Likelihood Regression for Learning-Speed Curves
 University of Illinois at
, 1995
Abstract

Cited by 11 (0 self)
The research presented here focuses on modeling machine-learning performance. The thesis introduces Seer, a system that generates empirical observations of classification-learning performance and then uses those observations to create statistical models. The models can be used to predict the number of training examples needed to achieve a desired level of accuracy and the maximum accuracy possible given an unlimited number of training examples. Seer advances the state of the art with 1) models that embody the best constraints for classification learning and most useful parameters, 2) algorithms that efficiently find maximum-likelihood models, and 3) a demonstration on real-world data from three domains of a practicable application of such modeling. The first part of the thesis gives an overview of the requirements for a good maximum-likelihood model of classification-learning performance. Next, reasonable design choices for such models are explored. Selection among such models is a task of nonlinear programming, but by exploiting appropriate problem constraints, the task is reduced to a nonlinear regression task that can be solved with an efficient iterative algorithm. The latter part of the thesis describes almost 100 experiments in the domains of soybean disease, heart disease, and audiological problems. The tests show that Seer is excellent at characterizing learning performance and that it seems to be as good as possible at predicting learning
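A learning-curve model of the general kind described, accuracy approaching an asymptote as the number of training examples grows, can be fitted by nonlinear regression. The inverse power-law form and every number below are assumptions for illustration, not Seer's actual model:

```python
import numpy as np
from scipy.optimize import curve_fit

def learning_curve(n, a, b, c):
    """Inverse power-law learning curve: accuracy rises toward the
    asymptote a as the number of training examples n grows."""
    return a - b * n ** (-c)

# Simulated accuracy observations at increasing training-set sizes.
rng = np.random.default_rng(1)
n = np.array([25, 50, 100, 200, 400, 800, 1600], dtype=float)
acc = learning_curve(n, 0.92, 0.9, 0.5) + 0.005 * rng.standard_normal(n.size)

# Nonlinear least-squares fit of the three curve parameters.
params, _ = curve_fit(learning_curve, n, acc, p0=[0.9, 1.0, 0.5])
a_hat = params[0]                 # estimated asymptotic accuracy
mse = np.mean((acc - learning_curve(n, *params)) ** 2)

# Invert the fitted curve: examples needed to reach 85% accuracy (hypothetical target).
n_needed = (params[1] / (a_hat - 0.85)) ** (1.0 / params[2])
```

Once fitted, the curve answers both questions the abstract names: its asymptote estimates the best accuracy achievable, and inverting it predicts how many training examples a target accuracy requires.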
Exploring the dynamics of international trade by combining the comparative advantages of multivariate statistics and network visualizations
 Journal of social structure
, 2003
Abstract

Cited by 6 (0 self)
This paper contributes to an ongoing debate in International Political Economy about the appropriateness of globalization, regionalization and macroeconomic imbalance theory by identifying quantitative estimates for all three tendencies from world trade data. This is achieved with a series of gravity models enhanced stepwise by the mapping of the estimation errors of a given model on representations of the overall structure of trade. This not only allows the identification of imperfections in a given model but also permits the further improvement of the models since any systematic regional organization in the error terms can be identified. The results of the most elaborated model indicate that single factor explanations of global economic integration are presumably misleading. Instead, each of three explanations captures only part of the ongoing changes, as they can be identified under a comparative static perspective from world trade data.
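The starting point of the stepwise procedure, a gravity model whose estimation errors are then mapped onto the trade network, can be sketched as an OLS regression in logs. The country-pair data and all coefficients below are simulated and hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
n_pairs = 500

# Hypothetical country-pair covariates: log GDPs of both partners, log distance.
log_gdp_i = rng.normal(10.0, 1.0, n_pairs)
log_gdp_j = rng.normal(10.0, 1.0, n_pairs)
log_dist = rng.normal(8.0, 0.7, n_pairs)

# Gravity model: log T_ij = b0 + b1*log GDP_i + b2*log GDP_j - b3*log dist + e
log_trade = (-4.0 + 1.0 * log_gdp_i + 0.9 * log_gdp_j - 1.1 * log_dist
             + 0.3 * rng.standard_normal(n_pairs))

# OLS estimation of the gravity coefficients.
X = np.column_stack([np.ones(n_pairs), log_gdp_i, log_gdp_j, log_dist])
coef, *_ = np.linalg.lstsq(X, log_trade, rcond=None)

# The residuals are what the paper maps onto network visualizations of
# trade structure, to spot systematic regional patterns the model misses.
residuals = log_trade - X @ coef
```

A systematic regional clustering of positive or negative residuals in the visualization then signals a missing term (for example a regional-bloc dummy), which motivates the next, enhanced model in the stepwise series.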
Asymmetric Information and Survival in Financial Markets
, 1999
Abstract

Cited by 3 (1 self)
In the evolutionary setting for a financial market developed by Blume and Easley (1992), we consider an infinitely repeated version of a model à la Grossman and Stiglitz (1980) with asymmetrically informed traders. Informed traders observe the realisation of a payoff relevant signal before making their portfolio decisions. Uninformed traders do not have direct access to this kind of information, but can partially infer it from market prices. As a counterpart for their privileged information, informed traders pay a per period cost. As a result, information acquisition triggers a tradeoff in our setting. We prove that, as long as information is costly, a strictly positive measure of uninformed traders will survive. This result contributes to the literature on noise trading. It suggests that Friedman (1953)'s argument against the importance of noise traders in the process of price determination is too simplistic. Traders whose beliefs are wrong according to the best available information, in fact, are not wiped out by market forces and do affect asset prices in the long run.