Results 1–10 of 43
“Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference,” Political Analysis, 2007
Abstract

Cited by 106 (34 self)
Although published works rarely include causal estimates from more than a few model specifications, authors usually choose the presented estimates from numerous trial runs readers never see. Given the often large variation in estimates across choices of control variables, functional forms, and other modeling assumptions, how can researchers ensure that the few estimates presented are accurate or representative? How do readers know that publications are not merely demonstrations that it is possible to find a specification that fits the author’s favorite hypothesis? And how do we evaluate or even define statistical properties like unbiasedness or mean squared error when no unique model or estimator even exists? Matching methods, which offer the promise of causal inference with fewer assumptions, constitute one possible way forward, but crucial results in this fast-growing methodological …
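As a concrete illustration of matching as nonparametric preprocessing, the sketch below pairs each treated unit with its nearest control on the covariates before comparing outcomes. This is a minimal nearest-neighbor version run on simulated data; the function name, variable names, and data-generating process are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def nearest_neighbor_att(X, treat, y):
    """Match each treated unit to its nearest control on the covariates X
    (Euclidean distance, matching with replacement) and estimate the
    average treatment effect on the treated (ATT) as the mean matched
    outcome difference."""
    treated = np.flatnonzero(treat == 1)
    controls = np.flatnonzero(treat == 0)
    diffs = []
    for i in treated:
        dist = np.linalg.norm(X[controls] - X[i], axis=1)
        j = controls[np.argmin(dist)]      # closest control unit
        diffs.append(y[i] - y[j])
    return float(np.mean(diffs))

# Illustrative data: two covariates, a constant true treatment effect of 2.0
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
treat = (rng.random(200) < 0.5).astype(int)
y = X @ np.array([1.0, -0.5]) + 2.0 * treat + rng.normal(size=200)
att = nearest_neighbor_att(X, treat, y)
```

Because matching happens before any outcome model is fit, the same preprocessed sample can then be passed to whatever parametric model the analyst prefers, which is the sense in which matching reduces model dependence.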
“Causal Parameters and Policy Analysis in Economics: A Twentieth Century Retrospective,” Quarterly Journal of Economics 115 (February …)
Abstract

Cited by 60 (4 self)
JEL No. C10 The major contributions of twentieth century econometrics to knowledge were the definition of causal parameters when agents are constrained by resources and markets and causes are interrelated, the analysis of what is required to recover causal parameters from data (the identification problem), and clarification of the role of causal parameters in policy evaluation and in forecasting the effects of policies never previously experienced. This paper summarizes the development of those ideas by the Cowles Commission, the response to their work by structural econometricians and VAR econometricians, and the response to structural and VAR econometrics by calibrators, advocates of natural and social experiments, and by nonparametric econometricians and statisticians.
Program Evaluation as a Decision Problem, 2002
Abstract

Cited by 30 (0 self)
I argue for thinking of program evaluation as a decision problem. There are two steps. First, a counselor determines which program (treatment or control) each individual joins, based for example on maximizing the probability of employment or expected earnings. Second, the policymaker decides whether to assign all individuals to treatment, to assign all to control, or to allow the counselor to choose. This framework has two advantages. First, individualized assignment rules (known as profiling) can raise the average impact, improving cost-effectiveness by exploiting treatment-impact heterogeneity. Second, it accounts systematically for inequality and uncertainty, and the policymaker’s attitude toward these, in the evaluation.
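The profiling logic in this abstract can be illustrated with a small simulation: when treatment impacts are heterogeneous, an individualized rule that treats only those with a positive predicted impact weakly dominates both blanket assignments in average outcome. The data-generating process below is entirely hypothetical and assumes the counselor observes the impact exactly.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000
x = rng.normal(size=n)        # individual characteristic seen by the counselor
tau = 0.5 * x                 # heterogeneous impact: treatment helps some, hurts others
y0 = rng.normal(size=n)       # outcome under control

# Policymaker's three options from the abstract:
mean_all_control = y0.mean()                               # assign all to control
mean_all_treated = (y0 + tau).mean()                       # assign all to treatment
# Profiling: counselor treats only individuals with a positive impact.
mean_profiled = (y0 + np.where(tau > 0.0, tau, 0.0)).mean()
```

By construction the profiled mean outcome is at least as large as either blanket rule, which is the cost-effectiveness gain from exploiting treatment-impact heterogeneity; with noisy predictions of `tau` the dominance would hold only approximately.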
THE SCIENTIFIC MODEL OF CAUSALITY, 2005
Abstract

Cited by 23 (2 self)
Causality is a very intuitive notion that is difficult to make precise without lapsing into tautology. Two ingredients are central to any definition: (1) a set of possible outcomes (counterfactuals) generated by a function of a set of “factors” or “determinants” and (2) a manipulation where one (or more) of the “factors” or “determinants” is changed. An effect is realized as a change in the argument of a stable function that produces the same change in the outcome for a class of interventions that change the “factors” by the same amount. The outcomes are compared at different levels of the factors or generating variables. Holding all factors save one at a constant level, the change in the outcome associated with manipulation of the varied factor is called a causal effect of the manipulated factor. This definition, or some version of it, goes back to Mill (1848) and Marshall (1890). Haavelmo (1943) made it more precise within the context of linear equations models. The phrase ‘ceteris paribus’ (everything else held constant) is a mainstay of economic analysis …
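The definition sketched in this abstract can be written compactly; the notation below is an illustrative rendering, not taken from the paper itself.

```latex
% Outcomes are generated by a stable function of factors x_1,\dots,x_K:
y = f(x_1, x_2, \dots, x_K)
% Holding all factors save x_j constant (ceteris paribus), the causal
% effect of manipulating x_j from x_j to x_j' on the outcome is
\Delta y = f(x_1, \dots, x_j', \dots, x_K) - f(x_1, \dots, x_j, \dots, x_K)
```

Stability of $f$ is what makes the definition non-tautological: the same manipulation of the same factor must produce the same change in $y$ across the whole class of interventions considered.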
Aspects Of Graphical Models Connected With Causality, 1993
Abstract

Cited by 13 (10 self)
This paper demonstrates the use of graphs as a mathematical tool for expressing independencies, and as a formal language for communicating and processing causal information in statistical analysis. We show how complex information about external interventions can be organized and represented graphically and, conversely, how the graphical representation can be used to facilitate quantitative predictions of the effects of interventions. We first review the Markovian account of causation and show that directed acyclic graphs (DAGs) offer an economical scheme for representing conditional independence assumptions and for deducing and displaying all the logical consequences of such assumptions. We then introduce the manipulative account of causation and show that any DAG defines a simple transformation which tells us how the probability distribution will change as a result of external interventions in the system. Using this transformation it is possible to quantify, from nonexperimental data …
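The DAG-defined transformation this abstract refers to is commonly written as the truncated factorization: an intervention that fixes a variable deletes that variable’s own conditional factor from the joint distribution. The rendering below uses standard notation and is paraphrased rather than quoted from the paper.

```latex
% Markov factorization of a DAG over variables x_1,\dots,x_n
% with parent sets pa_i:
P(x_1, \dots, x_n) = \prod_{i=1}^{n} P(x_i \mid pa_i)
% External intervention setting x_j to a fixed value x_j^{*}:
P\bigl(x_1, \dots, x_n \mid \mathrm{set}(x_j = x_j^{*})\bigr) =
  \begin{cases}
    \prod_{i \neq j} P(x_i \mid pa_i) & \text{if } x_j = x_j^{*},\\[2pt]
    0 & \text{otherwise.}
  \end{cases}
```

Each remaining factor can in principle be estimated from nonexperimental data, which is how the transformation supports quantitative predictions of intervention effects.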
“Does Temporary Agency Work Provide a Stepping Stone to Regular Employment?” Unpublished working paper, 2005
Abstract

Cited by 11 (0 self)
Based on administrative data from the federal employment office in Germany, we apply matching techniques to estimate the stepping-stone function of temporary agency work for the unemployed, i.e. its short-run and long-run effects on their future employment prospects. Our results show that unemployed workers who take up a job in the temporary work agency (TWA) industry are on average more likely than unemployed workers not joining TWA work to be employed (TWA or regular job) for up to 18 months and to exhibit lower monthly unemployment probabilities for about 6 months. However, we find no discernible medium- to long-term effect on monthly employment probabilities for a regular job. Our findings therefore do not lend support to the stepping-stone function of temporary agency work. If anything, a statistically significant, but short-term access-to-work function of TWA work emerges from our (yet preliminary) empirical analysis.
What if the UK or Sweden had joined the Euro in 1999? An empirical evaluation using a global VAR, International Journal of Finance and Economics, 2007
Abstract

Cited by 9 (1 self)
This paper attempts to provide a conceptual framework for the analysis of counterfactual scenarios using macroeconometric models. As an application we consider UK entry to the euro. Entry involves a long-term commitment to restrict UK nominal exchange rates and interest rates to be the same as those of the euro area. We derive conditional probability distributions for the difference between the future realisations of variables of interest (e.g. UK and euro area output and prices) subject to UK entry restrictions being fully met over a given period and the alternative realisations without the restrictions. The robustness of the results can be evaluated by also conditioning on variables deemed to be invariant to UK entry, such as oil or US equity prices. Economic interdependence means that such policy evaluation must take account of international linkages and common factors that drive fluctuations across economies. In this paper this is accomplished using the Global VAR recently developed by Dees, di Mauro, Pesaran and Smith (2006). The paper briefly describes the GVAR, which has been estimated for 25 countries and the euro area over the period 1979–2003. It reports probability estimates that output will be higher and prices lower in the UK and the euro area as a result of entry. It examines the sensitivity of these results to a variety of assumptions about when and how the UK entered and the observed global shocks, and compares them with the effects of Swedish entry.
Randomization does not justify logistic regression, Advances in Applied Mathematics, 2008
Abstract

Cited by 7 (1 self)
Logit models are often used to analyze experimental data. However, randomization does not justify the model, and estimators may be inconsistent. Here, Neyman’s nonparametric setup is used as a benchmark. Each subject has two potential responses, one if treated and the other if untreated; only one of the two responses is observed. A consistent estimator is proposed for use with the logit model. There is a brief literature review, and some recommendations for practice.
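Neyman’s nonparametric setup described in this abstract is easy to simulate: fixed potential responses, a randomized assignment that reveals one of them per subject, and the difference in means as an estimator of the average causal effect. The binary-response probabilities below are arbitrary illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
# Neyman's setup: each subject has two fixed potential responses;
# only one of the two is ever observed.
y0 = rng.binomial(1, 0.30, size=n)   # response if untreated
y1 = rng.binomial(1, 0.45, size=n)   # response if treated
ate = (y1 - y0).mean()               # finite-population average causal effect

assign = rng.random(n) < 0.5         # randomized treatment assignment
y_obs = np.where(assign, y1, y0)     # assignment reveals one potential response

diff_in_means = y_obs[assign].mean() - y_obs[~assign].mean()
```

Randomization alone makes the difference in means a sensible benchmark estimator here; no logit (or any other) model of the response is assumed, which is the contrast the paper draws.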
Quantile Treatment Effects in the Regression Discontinuity Design: Process Results and Gini Coefficient, 2010
Logicist Statistics I. Models and Modeling, Statistical Science, 1998
Abstract

Cited by 6 (0 self)
Arguments are presented to support increased emphasis on logical aspects of formal methods of analysis, depending on probability in the sense of R. A. Fisher. Formulating probabilistic models that convey uncertain knowledge of objective phenomena and using such models for inductive reasoning are central activities of individuals that introduce limited but necessary subjectivity into science. Statistical models are classified into overlapping types called here empirical, stochastic and predictive, all drawing on a common mathematical theory of probability, and all facilitating statements with logical and epistemic content. Contexts in which these ideas are intended to apply are discussed via three major examples. Key words and phrases: Logicism and proceduralism; specificity of analysis; formal subjective probability; complementarity; subjective and objective; formal and informal; empirical, stochastic and predictive models; U.S. national census; screening for chronic disease; global climate change.