"Causal Parameters and Policy Analysis in Economics: A Twentieth Century Retrospective." Quarterly Journal of Economics 115 (February
Abstract

Cited by 57 (4 self)
JEL No. C10 The major contributions of twentieth century econometrics to knowledge were the definition of causal parameters when agents are constrained by resources and markets and causes are interrelated, the analysis of what is required to recover causal parameters from data (the identification problem), and clarification of the role of causal parameters in policy evaluation and in forecasting the effects of policies never previously experienced. This paper summarizes the development of those ideas by the Cowles Commission, the response to their work by structural econometricians and VAR econometricians, and the response to structural and VAR econometrics by calibrators, advocates of natural and social experiments, and by nonparametric econometricians and statisticians.
From association to causation: Some remarks on the history of statistics
 Statist. Sci
, 1999
Abstract

Cited by 23 (6 self)
The “numerical method” in medicine goes back to Pierre Louis’ study of pneumonia (1835), and John Snow’s book on the epidemiology of cholera (1855). Snow took advantage of natural experiments and used convergent lines of evidence to demonstrate that cholera is a waterborne infectious disease. More recently, investigators in the social and life sciences have used statistical models and significance tests to deduce cause-and-effect relationships from patterns of association; an early example is Yule’s study on the causes of poverty (1899). In my view, this modeling enterprise has not been successful. Investigators tend to neglect the difficulties in establishing causal relations, and the mathematical complexities obscure rather than clarify the assumptions on which the analysis is based. Formal statistical inference is, by its nature, conditional. If maintained hypotheses A, B, C,... hold, then H can be tested against the data. However, if A, B, C,... remain in doubt, so must inferences about H. Careful scrutiny of maintained hypotheses should therefore be a critical part of empirical work—a principle honored more often in the breach than the observance. Snow’s work on cholera will be contrasted with modern studies that depend on statistical models and tests of significance. The examples may help to clarify the limits of current statistical techniques for making causal inferences from patterns of association.
On specifying graphical models for causation, and the identification problem
 Evaluation Review
, 2004
Abstract

Cited by 18 (1 self)
This paper (which is mainly expository) sets up graphical models for causation, having a bit less than the usual complement of hypothetical counterfactuals. Assuming the invariance of error distributions may be essential for causal inference, but the errors themselves need not be invariant. Graphs can be interpreted using conditional distributions, so that we can better address connections between the mathematical framework and causality in the world. The identification problem is posed in terms of conditionals. As will be seen, causal relationships cannot be inferred from a data set by running regressions unless there is substantial prior knowledge about the mechanisms that generated the data. There are few successful applications of graphical models, mainly because few causal pathways can be excluded on a priori grounds. The invariance conditions themselves remain to be assessed.
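The abstract's central claim, that causal relationships cannot be read off a data set by running regressions without prior knowledge of the generating mechanism, can be illustrated with a minimal simulation. Everything below (variable names, coefficients, sample size) is hypothetical and not taken from the paper: a common cause Z drives both X and Y, X has no causal effect on Y, yet the naive regression of Y on X finds a strong slope; only adjusting for Z, which requires knowing the causal graph, removes it.

```python
import random

def simulate(n=50000, seed=1):
    """Generate data from the graph Z -> X, Z -> Y; X has no effect on Y."""
    rng = random.Random(seed)
    xs, ys, zs = [], [], []
    for _ in range(n):
        z = rng.gauss(0, 1)                 # common cause
        xs.append(2.0 * z + rng.gauss(0, 1))  # X depends only on Z
        ys.append(3.0 * z + rng.gauss(0, 1))  # Y depends only on Z
        zs.append(z)
    return xs, ys, zs

def ols_slope(x, y):
    """Slope of the simple regression of y on x: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

xs, ys, zs = simulate()
naive = ols_slope(xs, ys)        # ~1.2: pure confounding, no causal effect
b_xz = ols_slope(zs, xs)         # ~2.0: Z's effect on X
resid = [a - b_xz * c for a, c in zip(xs, zs)]
adjusted = ols_slope(resid, ys)  # ~0.0 once Z is controlled for
```

The point is that `naive` and `adjusted` come from the same data; only the analyst's assumption about which graph generated the data decides which one is the "causal" estimate.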
From association to causation via regression
 Indiana: University of Notre Dame
, 1997
Abstract

Cited by 16 (6 self)
For nearly a century, investigators in the social sciences have used regression models to deduce cause-and-effect relationships from patterns of association. Path models and automated search procedures are more recent developments. In my view, this enterprise has not been successful. The models tend to neglect the difficulties in establishing causal relations, and the mathematical complexities tend to obscure rather than clarify the assumptions on which the analysis is based. Formal statistical inference is, by its nature, conditional. If maintained hypotheses A, B, C,... hold, then H can be tested against the data. However, if A, B, C,... remain in doubt, so must inferences about H. Careful scrutiny of maintained hypotheses should therefore be a critical part of empirical work, a principle honored more often in the breach than the observance.
A Parallel Cutting-Plane Algorithm for the Vehicle Routing Problem With Time Windows
, 1999
Abstract

Cited by 11 (1 self)
In the vehicle routing problem with time windows a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may only serve the customers on a route if the total demand does not exceed the capacity of the vehicle. The most effective solution method proposed to date for this problem is due to Kohl, Desrosiers, Madsen, Solomon, and Soumis. Their algorithm uses a cutting-plane approach followed by a branch-and-bound search with column generation, where the columns of the LP relaxation represent routes of individual vehicles. We describe a new implementation of their method, using Karger's randomized minimum-cut algorithm to generate cutting planes. The standard benchmark in this area is a set of 87 problem instances generated in 1984 by M. Solomon; making use of parallel processing in both the cutting-pla...
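The Karger routine the abstract mentions is the randomized edge-contraction algorithm for global minimum cut. The sketch below is a generic textbook version of that algorithm, not the paper's implementation (its separation routine for VRPTW cutting planes is more involved); function names and parameters are illustrative.

```python
import random

def karger_min_cut(edges, trials=200, seed=0):
    """Estimate the minimum cut of an undirected multigraph.

    edges: list of (u, v) pairs defining the graph.
    Repeatedly contracts random edges until two supernodes remain;
    the best cut found over `trials` repetitions is returned.
    """
    rng = random.Random(seed)
    nodes = {v for e in edges for v in e}
    best = float("inf")
    for _ in range(trials):
        # Union-find over the nodes; each contraction merges two supernodes.
        parent = {v: v for v in nodes}

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]  # path halving
                v = parent[v]
            return v

        remaining = len(nodes)
        pool = list(edges)
        while remaining > 2 and pool:
            u, v = pool.pop(rng.randrange(len(pool)))
            ru, rv = find(u), find(v)
            if ru != rv:          # skip self-loops inside a supernode
                parent[ru] = rv   # contract the chosen edge
                remaining -= 1
        # Edges whose endpoints lie in different supernodes cross the cut.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

Each single trial succeeds with probability at least 2/(n(n-1)), so the algorithm is run many times and the smallest cut kept; in the cutting-plane setting, a small cut exposes a violated capacity constraint to add to the LP.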
“The Promises and Perils of Agent-Based Computational Economics”, LABORatorio Revelli Working Paper No
, 2003
Abstract

Cited by 2 (1 self)
LABOR is an independent research centre within Coripe Piemonte
The Place of Statistical Modelling in Management Science: Critical Realism and Multimethodology
, 2003
Abstract
Traditional “hard” OR has been based on quantitative modelling and embodies, usually implicitly, a positivist or empiricist philosophy. “Soft OR”, for instance problem structuring methods such as SSM, developed as an antithesis and embodied an interpretivist or constructivist philosophy. This has generated something of a schism between the two sides. Previous papers have advocated critical realism as a philosophy of science that can potentially provide a dialectical synthesis in recognising both the value and limitations of these approaches. This paper explores the critical realist critique of quantitative modelling, as exemplified by multivariate statistics, and argues that its grounds must be reconceptualised within a multimethodological framework.
Econometrics: A Bird’s Eye View
, 2006
Abstract
As a unified discipline, econometrics is still relatively young and has been transforming and expanding very rapidly over the past few decades. Major advances have taken place in the analysis of cross-sectional data by means of semiparametric and nonparametric techniques. Heterogeneity of economic relations across individuals, firms and industries is increasingly acknowledged, and attempts have been made to take it into account either by integrating out its effects or by modeling the sources of heterogeneity when suitable panel data exist. The counterfactual considerations that underlie policy analysis and treatment evaluation have been given a more satisfactory foundation. New time series econometric techniques have been developed and employed extensively in the areas of macroeconometrics and finance. Nonlinear econometric techniques are used increasingly in the analysis of cross section and time series observations. Applications of Bayesian techniques to econometric problems have been given new impetus largely thanks to advances in computer power and computational techniques. The use of Bayesian techniques has in turn provided investigators with a unifying framework where the tasks of forecasting, decision making, model evaluation and learning can be considered as parts of the same interactive and iterative process, thus paving the way for establishing the foundation of “real time econometrics”. This paper attempts to provide an overview of some of these developments.
Model Identification and Nonunique Structure
 University of Oxford
, 2002
Abstract
Identification is an essential attribute of any model's parameters, so we consider its three aspects of ‘uniqueness’, ‘correspondence to reality’ and ‘interpretability’. Observationally equivalent overidentified models can coexist, and are mutually encompassing in the population; correctly identified models need not correspond to the underlying structure, and may be wrongly interpreted. That a given model is overidentified with all overidentifying restrictions valid (even asymptotically) is insufficient to demonstrate that it is a unique representation. Moreover, structure (as invariance under extended information) need not be identifiable. We consider the role of structural breaks to discriminate between such representations.
Estimation, Economic methodology
, 2004
Abstract
In this paper I analyse the main strengths and weaknesses of agent-based computational models. I first describe how agent-based simulations can complement more traditional modelling techniques. Then, I rationalise the main theoretical critiques against the use of simulation, which point to the following problematic areas: (i) interpretation of the simulation dynamics, (ii) estimation of the simulation model, and (iii) generalisation of the results. I show that there exist solutions for all these issues. Along the way, I clarify some confounding differences in terminology between the computer science and the economics literature.