Results 1–10 of 159
Sources of Advantageous Selection: Evidence from the Medigap Insurance Market
, 2006
Abstract

Cited by 29 (3 self)
We provide strong evidence of advantageous selection in the Medigap insurance market, and analyze its sources. Using Medicare Current Beneficiary Survey (MCBS) data, we find that, conditional on controls for the price of Medigap, medical expenditures for senior citizens with Medigap coverage are, on average, about $4,000 less than for those without. But, if we condition on health, expenditures for seniors on Medigap are about $2,000 more. These two findings can only be reconciled if those with less health expenditure risk are more likely to purchase Medigap, implying advantageous selection. By combining the MCBS and the Health and Retirement Study (HRS), we investigate the sources of this advantageous selection. These include income, education, longevity expectations and financial planning horizons, as well as cognitive ability. Once we condition on all these factors, seniors with higher expected medical expenditure are indeed more likely to purchase Medigap. Surprisingly, risk preferences do not appear to be a source of advantageous selection. But cognitive ability emerges as a particularly important factor, consistent with a view that many senior citizens have difficulty understanding Medicare and Medigap rules.
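The reconciliation argument (insured spend less unconditionally, but more once health is held fixed) can be illustrated with a small simulation. Everything below is hypothetical: the functional forms and dollar figures are chosen only to reproduce the sign pattern, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical setup: health is a latent index; healthier (lower-risk)
# seniors are MORE likely to buy coverage (advantageous selection).
health = rng.normal(size=n)                       # higher = healthier
buy = rng.random(n) < 1 / (1 + np.exp(-health))   # purchase prob. rises with health

# Spending falls with health; coverage itself raises utilization.
spend = 10_000 - 3_000 * health + 2_000 * buy + rng.normal(0, 1_000, n)

# Unconditional comparison: the insured spend LESS on average...
gap_uncond = spend[buy].mean() - spend[~buy].mean()

# ...but comparing within a narrow health band, the insured spend MORE.
band = np.abs(health) < 0.1
gap_cond = spend[buy & band].mean() - spend[~buy & band].mean()

print(f"unconditional gap: {gap_uncond:,.0f}")        # negative
print(f"gap conditional on health: {gap_cond:,.0f}")  # positive, near +2,000
```

Both signs come from the same data: selection of healthier people into coverage pulls the unconditional gap negative even though coverage raises spending for any given health level.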
An economic model of user rating in an online recommender system
 In Proceedings of The 10th International Conference on User Modeling
, 2005
Abstract

Cited by 20 (4 self)
Economic modeling provides a formal mechanism to understand user incentives and behavior in online systems. In this paper we describe the process of building a parameterized economic model of user-contributed ratings in an online movie recommender system. We constructed a theoretical model to formalize our initial understanding of the system, and collected survey and behavioral data to calibrate an empirical model. This model explains 34% of the variation in user rating behavior. We found that while economic modeling in this domain requires an initial understanding of user behavior and access to an uncommonly broad set of user survey and behavioral data, it returns significant formal understanding of the activity being modeled.
Employee sentiment and stock option compensation, MIT working paper
, 2004
Abstract

Cited by 19 (1 self)
Preliminary and incomplete. The use of broad equity-based compensation for employees in the lower ranks of an organization is a puzzle for standard economic theory: any positive incentive effects should be diminished by free-rider problems, and undiversified employees should discount company equity heavily. We point out that employees do not appear to value company stock as prescribed by extant theory. Employees frequently purchase company stock for their 401(k) plans at market prices, and especially so after company stock has performed well, implying that their private valuation must at least equal the market price. We begin by developing a model of optimal compensation policy for a firm faced with employees with positive sentiment. Our goal is to establish the conditions necessary for the firm to compensate its employees with options in equilibrium, while explicitly taking into account that current and potential employees are able to purchase equity in the firm through the stock market. We show that using option compensation under these circumstances is not a puzzle if employees prefer the (non-traded) options offered by the firm to the (traded) equity offered by the market, or if the (traded) equity is overvalued. We then provide empirical evidence confirming that firms use broad-based option compensation when boundedly rational employees …
Neighbourhood Effects and House Demand
 Journal of Applied Econometrics
, 2003
Abstract

Cited by 15 (0 self)
In this paper, we estimate a model of housing demand with neighborhood effects. We exploit special features of the National sample of the American Housing Survey and properties of housing markets that allow us to create “natural” instruments and therefore identify the impact of social interactions. We find evidence of both endogenous and contextual neighborhood effects. We report two alternative sets of estimates for neighborhood effects that differ in terms of the instruments we use for estimating the model. When the endogenous neighborhood effect is large the respective contextual effects are weak, and vice versa. The elasticity of housing demand with respect to the mean of the neighbors’ housing demands (the endogenous effect) ranges from 0.19 to 0.66 and is generally very significant. The contextual effects are also very significant. A key such effect, the elasticity with respect to the mean of neighbors’ permanent incomes, ranges from 0.17 to 0.54.
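The identification strategy hinges on instrumenting the endogenous neighborhood variable. As a rough illustration of why instrumenting matters, here is a minimal two-stage least squares sketch on simulated data; the variables, instrument, and coefficients are invented and are not the paper's instruments or estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Hypothetical setup: x (think neighbors' mean demand) is endogenous
# because it shares an unobserved factor u with y; z is a valid
# instrument (shifts x, has no direct effect on y).
z = rng.normal(size=n)
u = rng.normal(size=n)                  # unobserved common factor
x = 0.8 * z + u + rng.normal(size=n)    # endogenous regressor
y = 0.4 * x + u + rng.normal(size=n)    # true effect is 0.4

# OLS is biased upward because cov(x, u) > 0.
b_ols = (x @ y) / (x @ x)

# 2SLS: first stage projects x on z, second stage regresses y on fitted x.
pi = (z @ x) / (z @ z)
x_hat = pi * z
b_2sls = (x_hat @ y) / (x_hat @ x_hat)

print(f"OLS:  {b_ols:.3f}")   # noticeably above 0.4
print(f"2SLS: {b_2sls:.3f}")  # close to 0.4
```

All variables are mean zero here, so the no-intercept formulas are the textbook ratio estimators; the 2SLS estimate recovers the true coefficient while OLS does not.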
ER²: An intuitive similarity measure for online signature verification
 In Proc. IWFHR9, Kimura
, 2004
Abstract

Cited by 11 (2 self)
ER² (Extended R-squared) is proposed as a similarity measure for online signature verification. SLR (Simple Linear Regression) defines R² as a measure of goodness-of-fit. We observed that R² is a good similarity measure for 1-dimensional sequences. However, many kinds of sequences are multidimensional, such as online signature sequences, 2D curves, etc. Therefore, we extend R² to ER² for multidimensional sequence matching. Coupled with optimal alignment, ER² outperforms DTW-based curve matching on online signature verification.
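A minimal sketch of the idea: R² from a simple linear regression serves as a 1-dimensional similarity score, and a multidimensional score can be built on top of it. The per-dimension averaging below is an assumption for illustration only; the paper's exact ER² formula may combine dimensions differently.

```python
import numpy as np

def r2_similarity(a, b):
    """R^2 from simple linear regression of b on a: a goodness-of-fit
    score in [0, 1] used as a similarity measure for two aligned
    1-dimensional sequences (1.0 = perfect linear relationship)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    r = np.corrcoef(a, b)[0, 1]
    return r * r

def er2_similarity(A, B):
    """Sketch of a multidimensional extension (assumption: simple
    average of per-dimension R^2). A, B: (n_points, n_dims)."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    return np.mean([r2_similarity(A[:, d], B[:, d]) for d in range(A.shape[1])])

t = np.linspace(0, 1, 200)
sig = np.column_stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])  # a 2-D "signature"
noisy = 1.3 * sig + 0.05 * np.random.default_rng(2).normal(size=sig.shape)

print(round(er2_similarity(sig, noisy), 3))  # close to 1 for a near-linear match
```

Note that R² is invariant to scale and offset, which is exactly why it tolerates the amplitude change (the 1.3 factor) between the two sequences.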
A Parallel Cutting-Plane Algorithm for the Vehicle Routing Problem With Time Windows
, 1999
Abstract

Cited by 11 (1 self)
In the vehicle routing problem with time windows a number of identical vehicles must be routed to and from a depot to cover a given set of customers, each of whom has a specified time interval indicating when they are available for service. Each customer also has a known demand, and a vehicle may only serve the customers on a route if the total demand does not exceed the capacity of the vehicle. The most effective solution method proposed to date for this problem is due to Kohl, Desrosiers, Madsen, Solomon, and Soumis. Their algorithm uses a cutting-plane approach followed by a branch-and-bound search with column generation, where the columns of the LP relaxation represent routes of individual vehicles. We describe a new implementation of their method, using Karger's randomized minimum-cut algorithm to generate cutting planes. The standard benchmark in this area is a set of 87 problem instances generated in 1984 by M. Solomon; making use of parallel processing in both the cutting-pla…
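The randomized contraction algorithm mentioned above can be sketched in a few lines. This is a generic illustration of Karger's min-cut on a small unweighted graph, not the cutting-plane separation routine used in the paper.

```python
import random

def karger_min_cut(edges, n_vertices, trials=200, seed=0):
    """Karger's randomized contraction: repeatedly contract random edges
    until two super-vertices remain; the edges crossing the two remaining
    groups form a cut. Repeating the contraction many times makes it
    highly likely that a minimum cut is found at least once.
    `edges` is a list of (u, v) pairs on vertices 0..n_vertices-1."""
    rng = random.Random(seed)
    best = len(edges) + 1
    for _ in range(trials):
        parent = list(range(n_vertices))

        def find(x):  # union-find with path halving
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        remaining = n_vertices
        pool = edges[:]
        rng.shuffle(pool)          # contract edges in random order
        for u, v in pool:
            if remaining == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:           # contract the edge: merge its endpoints
                parent[ru] = rv
                remaining -= 1
        # Count edges crossing the two surviving super-vertices.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best

# Two triangles joined by a single bridge edge: the minimum cut is 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(karger_min_cut(edges, 6))
```

Each trial succeeds only with modest probability, which is why the algorithm is run many times; in the cutting-plane context, near-minimum cuts found along the way can also be useful as violated inequalities.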
Revisiting the Omitted Variables Argument: Substantive vs. Statistical Adequacy
 FORTHCOMING IN THE JOURNAL OF ECONOMIC METHODOLOGY
Abstract

Cited by 10 (8 self)
The problem of omitted variables is commonly viewed as a statistical misspecification issue which renders the inference concerning the influence of Xt on yt unreliable, due to the exclusion of certain relevant factors Wt. That is, omitting certain potentially important factors Wt may confound the influence of Xt on yt. The textbook omitted variables argument attempts to assess the seriousness of this unreliability using the sensitivity of the estimator b = (X′X)⁻¹X′y to the inclusion/exclusion of Wt, by tracing that effect to the potential bias/inconsistency of b. It is argued that the confounding problem is one of substantive inadequacy in so far as the potential error concerns subject-matter, not statistical, information. Moreover, the textbook argument in terms of the sensitivity of point estimates provides a poor basis for addressing the confounding problem. The paper reframes the omitted variables question into a hypothesis testing problem, supplemented with a post-data evaluation of inference based on severe testing. It is shown that this testing perspective can deal effectively with assessing the problem of confounding raised by the omitted variables argument. The assessment of the confounding effect using hypothesis testing is related to the conditional independence and faithfulness assumptions of graphical causal modeling.
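A small simulation makes the sensitivity of b = (X′X)⁻¹X′y to the inclusion or exclusion of W concrete; the data-generating process below is hypothetical, chosen so that the omitted factor is correlated with the included regressor.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20_000

# Hypothetical data: w is a relevant factor correlated with x;
# the true model is y = 1.0*x + 2.0*w + noise.
w = rng.normal(size=n)
x = 0.6 * w + rng.normal(size=n)
y = 1.0 * x + 2.0 * w + rng.normal(size=n)

def ols(X, y):
    """Least-squares coefficients, i.e. b = (X'X)^{-1} X'y."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

b_full = ols(np.column_stack([x, w]), y)[0]  # w included: near the true 1.0
b_omit = ols(x.reshape(-1, 1), y)[0]         # w omitted: confounded estimate

print(f"with w:     {b_full:.3f}")
print(f"omitting w: {b_omit:.3f}")
```

The omitted-variable estimate converges to 1.0 + 2.0·cov(x, w)/var(x), so the point estimate is sensitive to the exclusion of w exactly as the textbook argument describes.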
Applications of Generalized Method of Moments Estimation
 JOURNAL OF ECONOMIC PERSPECTIVES—VOLUME 15, NUMBER 4—FALL 2001—PAGES 87–100
, 2001
Abstract

Cited by 8 (0 self)
The method of moments approach to parameter estimation dates back more than 100 years (Stigler, 1986). The notion of a moment is fundamental for describing features of a population. For example, the population mean (or population average), usually denoted μ, is the moment that measures central tendency. If y is a random variable describing the population of interest, we also write the population mean as E(y), the expected value or mean of y. (The mean of y is also called the first moment of y.) The population variance, usually denoted σ² or Var(y), is defined as the second moment of y centered about its mean: Var(y) = E[(y − μ)²]. The variance, also called the second central moment, is widely used as a measure of spread in a distribution. Since we can rarely obtain information on an entire population, we use a sample from the population to estimate population moments. If {yᵢ : i = 1, …, n} is a sample from a population with mean μ, the method of moments estimator of μ is just the sample average: ȳ = (y₁ + y₂ + … + yₙ)/n. Under random sampling, ȳ is unbiased and consistent for μ regardless of other features of the underlying population. Further, as long as the population variance is finite, ȳ is the best linear unbiased estimator of μ. An unbiased and consistent estimator of σ² also exists and is called the sample variance, usually denoted s². Method of moments estimation applies in more complicated situations. For example, suppose that in a population with μ > 0, we know that the variance is three times the mean: σ² = 3μ. The sample average, ȳ, is unbiased and consistent… (See Wooldridge (2000, appendix C) for more discussion of the sample mean and sample variance as method of moments estimators.)
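The two moment conditions in the example (the first moment, and σ² = 3μ) each yield an estimator of μ. A short simulated illustration, using an invented population constructed so that σ² = 3μ holds exactly:

```python
import numpy as np

rng = np.random.default_rng(4)

# Invented population: 3 times a Poisson(2) draw, so the population
# mean is 3*2 = 6 and the population variance is 9*2 = 18 = 3 * mean,
# matching the sigma^2 = 3*mu restriction in the text.
y = 3 * rng.poisson(2.0, size=100_000)

mu_hat = y.mean()        # method of moments estimator from the first moment
s2 = y.var(ddof=1)       # sample variance (estimator of the second central moment)

# The restriction sigma^2 = 3*mu turns the variance into a second
# estimator of mu:
mu_hat_alt = s2 / 3

print(round(mu_hat, 2), round(mu_hat_alt, 2))  # both near 6
```

When a parameter is overidentified like this, GMM provides the standard recipe for combining the two moment conditions into a single efficient estimate rather than choosing between them.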
The effects of shilling on final bid prices in online auctions
 Electronic Commerce Research and Applications
, 2005
Abstract

Cited by 7 (1 self)
An increasing number of reports of online auction fraud are of growing concern to auction operators and participants. In this research, we discuss reserve price shilling, where a bidder shills in order to avoid paying auction house fees, rather than to drive up the price of the final bid. We examine the effect that premium bids, since they are linked with reserve price shill bids, have upon the final selling price. We use 10,260 eBay auctions during April 2001, and identify 1,389 auctions involving 493 sellers and 1,314 involved in concurrent auctions of the exact same item. We find that premium bidding occurs 23% of the time, in 263 of the 1,389 auctions. Using a theoretical perspective involving valuation signals, we show that other bidders may view high bids as signals that an item is worth more. Thus, they may be willing to pay more for the item than for items that do not receive premium bids. The implications are disturbing in that sellers may be more motivated to enter a shill bid in order to drive up the final price in an online auction. We also examine and report on alternative hypotheses involving the winner’s curse and the possibility of reserve price shill bids. Our results are developed in the context of a weighted least…
AN ANALYSIS OF INCOME POVERTY EFFECTS IN CASH CROPPING ECONOMIES IN RURAL MOZAMBIQUE: BLENDING ECONOMETRIC AND ECONOMY-WIDE MODELS
, 2006
Abstract

Cited by 4 (1 self)
Contract farming is a pervasive institutional arrangement in cash cropping economies in Mozambique. Empirical evidence on its nature and, especially, the extent to which policies can generate broad-based income growth and poverty reduction is lacking. This study investigates the rationale for persistence, the determinants of farmer participation and performance in cotton and tobacco schemes (Essay One), and the economy-wide effects of expansion and shocks in cotton and tobacco sectors on poverty reduction in concession areas of the Zambezi valley of Mozambique (Essay Two). In the first essay, we find that in both sectors contract farming is an institutional response to widespread failure in input, credit and output markets and the absence of a functional public and market based service provision network. Two-stage econometric procedures (testing for the existence of threshold effects in land holdings and educational attainment) indicate that in both areas participation in the schemes is driven by factor endowments, asset ownership and alternative income opportunities, and very little by demographic factors. Also, there are no returns to education in either sector; this result is consistent with previous research in Mozambique but surprising in an agronomically…