Results 1–10 of 18
Empirical Performance Evaluation Methodology and Its Application to Page Segmentation Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence, 2001
Cited by 34 (5 self)
In this paper, we use the following five-step methodology to quantitatively compare the performance of page segmentation algorithms: 1) first, we create mutually exclusive training and test data sets with ground truth; 2) we then select a meaningful and computable performance metric; 3) an optimization procedure is then used to search automatically for the optimal parameter values of the segmentation algorithms on the training data set; 4) the segmentation algorithms are then evaluated on the test data set; and, finally, 5) a statistical and error analysis is performed to give the statistical significance of the experimental results. In particular, instead of the ad hoc and manual approach typically used in the literature for training algorithms, we pose the automatic training of algorithms as an optimization problem and use the Simplex algorithm to search for the optimal parameter value. A paired-model statistical analysis and an error analysis are then conducted to provide confidence intervals for the experimental results of the algorithms. This methodology is applied to the evaluation of five page segmentation algorithms, of which three are representative research algorithms and the other two are well-known commercial products, on 978 images from the University of Washington III data set. It is found that the performance indices (average text-line accuracy) of the Voronoi, Docstrum, and Caere segmentation algorithms are not significantly different from each other, but they are significantly better than that of ScanSoft's segmentation algorithm, which, in turn, is significantly better than that of the X-Y cut algorithm.
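The automatic-training step described above (posing parameter selection as an optimization problem and searching with the Simplex algorithm) can be sketched with SciPy's Nelder-Mead downhill simplex. The objective `textline_error` below is a hypothetical stand-in for running a segmenter and scoring its text-line error, not the paper's actual pipeline:

```python
# Hedged sketch: Nelder-Mead (downhill simplex) search for segmentation
# parameters that minimize training error. `textline_error` is an assumed
# toy surrogate objective with a known optimum at (5.0, 2.0); in practice
# it would run the segmentation algorithm and return 1 - accuracy.
import numpy as np
from scipy.optimize import minimize

def textline_error(params):
    x, y = params
    return (x - 5.0) ** 2 + (y - 2.0) ** 2  # stand-in for 1 - accuracy

result = minimize(textline_error, x0=np.array([1.0, 1.0]),
                  method="Nelder-Mead")
print(result.x)  # converges near [5.0, 2.0]
```

The same search loop works for any black-box objective, which is the point of the methodology: the training step only needs a computable performance metric, not gradients.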
Modeling multiyear observations of soil moisture recharge in the semiarid American Southwest
Water Resour. Res., 2000
Cited by 8 (1 self)
Abstract. The multi-year, root-zone soil moisture redistribution characteristics in a semiarid rangeland in southeastern Arizona were evaluated to determine the magnitude and variability of deep-profile, wintertime soil moisture recharge. Intermittent observations from 1990 to 1998 of average volumetric soil moisture under shrub and grass cover showed that significant recharge beyond 0.30 m principally occurs only in the wintertime, when the vegetation is senescent and does not use the infiltrating water. Using the physically based, variably saturated flow model HYDRUS, wintertime observations were modeled to determine the recharge of soil moisture at different depth intervals in the vadose zone. Two approaches were taken to estimate the soil model parameters. The first was to use basic soils data from detailed profile descriptions in conjunction with pedotransfer functions. The second was to use an automatic parameter search algorithm to find the optimal soil parameters that minimize the error between the model-computed volumetric water content and the observations. Automatic calibration of the model was performed using the shuffled complex evolution algorithm (SCE-UA), and it proved possible to satisfactorily describe the vadose zone observations using a simplified description of the soil profile with optimal model parameters. Simulations with the optimized model indicate that significant recharge of the vadose zone does occur well beyond 0.30 m in winter, but that such recharge is highly variable from year to year and appears correlated with El Niño episodes. This water could serve as a source of plant water for deeper-rooted plants that are active during the subsequent spring season, thereby exploiting a niche that the more abundant, shallower-rooted plants active during the summer rainy season do not.
However, the year-to-year variability of the winter precipitation and consequent deep soil moisture recharge indicates that the deeper-rooted vegetation in this region must retain the ability to obtain moisture from the near surface in order to meet its water demands if necessary.
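The automatic-calibration idea (searching for soil parameters that minimize the misfit between model output and observed water content) can be sketched as follows. SCE-UA itself is not in SciPy, so `differential_evolution` stands in as the global optimizer, and `model_theta` is a toy two-parameter surrogate, not the real HYDRUS/Richards-equation solver; the observation values are invented for illustration:

```python
# Hedged sketch: calibrate model parameters by minimizing the RMSE
# between simulated and observed volumetric water content.
# differential_evolution is a stand-in for SCE-UA; model_theta and the
# observations are assumed toy values, not from the study.
import numpy as np
from scipy.optimize import differential_evolution

observed = np.array([0.12, 0.18, 0.25, 0.21, 0.15])  # invented data

def model_theta(params, t):
    a, b = params
    return a * np.exp(-b * t)  # toy surrogate for the flow model

def rmse(params):
    t = np.arange(len(observed))
    return np.sqrt(np.mean((model_theta(params, t) - observed) ** 2))

best = differential_evolution(rmse, bounds=[(0.0, 1.0), (-1.0, 1.0)],
                              seed=0)
print(best.x, best.fun)
```

Swapping in the real forward model and real observations leaves the calibration loop unchanged, which is why population-based searches like SCE-UA pair naturally with vadose-zone models whose objective surfaces are rough.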
An integrated computational model of multiparty electoral competition, 2000
Cited by 4 (0 self)
Abstract. Most theoretic models of multiparty electoral competition assume that party leaders are motivated to maximize their vote share or seat share. In plurality-rule systems this is a sensible assumption. However, in proportional representation systems this assumption is questionable, since the ability to make public policy is not strictly increasing in vote shares or seat shares. We present a theoretic model in which party leaders choose electoral declarations with an eye toward the expected policy outcome of the coalition bargaining game induced by the party declarations and the parties' beliefs about citizens' voting behavior. To test this model, we turn to data from the 1989 Dutch parliamentary election. We use Markov chain Monte Carlo methods to estimate the parties' beliefs about mass voting behavior and to average over measurement uncertainty and missing data. Because of the complexity of the parties' objective functions and the uncertainty in the objective function estimates, equilibria are found numerically. Unlike previous models of multiparty electoral competition, the equilibrium results are consistent with the empirical declarations of the four major Dutch parties. Key words and phrases: Monte Carlo method, voting behavior, electoral strategy, coalition formation.
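The MCMC estimation step can be illustrated with a minimal random-walk Metropolis sampler. This is a deliberately simplified stand-in: it estimates a single "voting propensity" parameter from synthetic Bernoulli ballots, whereas the paper's model is far richer (multiple parties, measurement error, missing data):

```python
# Hedged sketch: random-walk Metropolis sampling of a posterior over a
# single voting-propensity parameter p, from synthetic 0/1 "ballots".
# The data and model are assumed for illustration, not from the paper.
import numpy as np

rng = np.random.default_rng(0)
votes = rng.binomial(1, 0.6, size=200)  # synthetic ballots, true p = 0.6

def log_post(p):
    if not 0.0 < p < 1.0:
        return -np.inf  # flat prior on (0, 1)
    k = votes.sum()
    return k * np.log(p) + (len(votes) - k) * np.log(1.0 - p)

samples, p = [], 0.5
for _ in range(5000):
    prop = p + rng.normal(0.0, 0.05)          # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(p):
        p = prop                               # accept
    samples.append(p)

post_mean = np.mean(samples[1000:])            # discard burn-in
print(post_mean)  # posterior mean near the empirical vote frequency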
Evaluation And Design Of Benchmark Suites
State-of-the-Art in Performance Modeling and Simulations: Theory, Techniques, and Tutorials, Chapter 12, Gordon and …, 1996
INTRODUCTION Benchmark suites are most frequently designed for industrial evaluation of competitive computer systems and networks. Examples of such benchmark suites include SPEC, TPC, GPC, the PERFECT Club Benchmarks, the AIM benchmarks, and others. In addition to benchmark suites sponsored by consortia of the computer industry, there are various collections of benchmarks designed by research organizations, companies, computer magazines, and individuals for benchmarking specific hardware and software systems. Examples of such benchmark suites include database benchmarks [6], supercomputer benchmarks (Livermore loops [7], NAS Parallel Benchmarks [8]), Lisp benchmarks [9], Prolog benchmarks [10], and many others [11]. In the majority of cases benchmark workloads are selected from a specifi …
Tough Love and Intergenerational Altruism, 2008
This paper develops and studies a tough love model of intergenerational altruism. We model tough love by modifying the Barro-Becker standard altruism model in two ways. First, the child's discount factor is endogenously determined, so that low consumption at a young age leads to a higher discount factor later in her life. Second, the parent evaluates the child's lifetime utility with a constant, high discount factor. The tough love model predicts that transfers from the parent will fall when the child's discount factor falls. This is in contrast with the prediction of the standard altruism model that transfers from parents are independent of exogenous changes in the child's discount factor.
unknown title, 2010
Appendix 1: Agency in product range choice with different transfer pricing methods. When the change in brand-country revenues from including product j rather than j′ falls between the right- and left-hand sides of inequality (4), then the local manager will choose to include product j when the firm would rather he included product j′. This inequality is: F_j + Σ_c mc_{j,c} q_{j,c} …
TABLE OF CONTENTS
Institutt for informasjons- og medievitenskap (Department of Information Science and Media Studies): Schema evolution with homonymy conflict resolution
MINIMUM DISTANCE ESTIMATION OF LOSS DISTRIBUTIONS
Loss distributions have a number of uses in the pricing and reserving of casualty insurance. Many authors have recommended maximum likelihood for the estimation of the parameters. It has the advantages of asymptotic optimality (in the sense of mean square error) and applicability (the likelihood function can always be written). Also, it is possible to estimate the variance of the estimate, a useful tool in assessing the accuracy of any results. The only disadvantage of maximum likelihood is that the objective function does not relate to the actuarial problem being investigated. Minimum distance estimates can be tailored to reflect the goals of the analysis and, as such, should give more appropriate answers. The purpose of this paper is to demonstrate that these estimates share the second and third desirable qualities with maximum likelihood. 1. DEFINITIONS, NOTATION, AND AGENDA. We start with a definition of a minimum distance estimate. Let G(c; θ) be any function of c that is uniquely related to f(c; θ), the probability density function (pdf) of the population. By uniquely related we mean that if you know f you can obtain G and vice versa. Call G the model functional. Let f_n(c) be the empirical density. It assigns probability 1/n to each of the n observations in the sample. Let G_n(c) be found from f_n in the same way that G is from f. Call G_n the empirical functional. The objective function is …
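A minimum distance estimate of the kind defined above can be sketched by taking the model functional G to be the CDF and choosing θ to minimize a Cramér–von Mises-type distance between the model CDF and the empirical functional G_n. Exponential losses with scale 2.0 are an illustrative assumption here, not the paper's example:

```python
# Hedged sketch: minimum distance estimation of an exponential scale
# parameter. G is the model CDF, G_n the empirical CDF; we minimize the
# summed squared distance between them at the sorted observations.
# The simulated loss data are assumed for illustration.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
losses = np.sort(rng.exponential(scale=2.0, size=500))
n = len(losses)
G_n = (np.arange(1, n + 1) - 0.5) / n          # empirical functional

def cvm_distance(theta):
    G = 1.0 - np.exp(-losses / theta)          # exponential model CDF
    return np.sum((G - G_n) ** 2)              # Cramér–von Mises-type sum

fit = minimize_scalar(cvm_distance, bounds=(0.1, 10.0), method="bounded")
print(fit.x)  # estimated scale, close to the true 2.0
```

Replacing the squared-CDF distance with, say, a distance weighted toward the tail is exactly the "tailoring to the goals of the analysis" that the paper advocates over the fixed likelihood objective.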