Results 1-10 of 202
2002a), “Statistical Analysis of a Telephone Call Center: A Queueing Science Perspective,” technical report, University of Pennsylvania, downloadable at http://iew3.technion.ac.il/serveng/References/references.html
Abstract

Cited by 148 (24 self)
A call center is a service network in which agents provide telephone-based services. Customers who seek these services are delayed in tele-queues. This article summarizes an analysis of a unique record of call center operations. The data comprise a complete operational history of a small banking call center, call by call, over a full year. Taking the perspective of queueing theory, we decompose the service process into three fundamental components: arrivals, customer patience, and service durations. Each component involves different basic mathematical structures and requires a different style of statistical analysis. Some of the key empirical results are sketched, along with descriptions of the varied techniques required. Several statistical techniques are developed for analysis of the basic components. One of these techniques is a test that a point process is a Poisson process. Another involves estimation of the mean function in a nonparametric regression with lognormal errors. A new graphical technique is introduced for nonparametric hazard rate estimation with censored data. Models are developed and implemented for forecasting of Poisson arrival rates. Finally, the article surveys how the characteristics deduced from the statistical analyses form the building blocks for theoretically interesting and practically useful mathematical models for call center operations.
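The abstract mentions a test that a point process is Poisson. The paper develops its own machinery for this; as a rough illustrative stand-in, a classical check is that the interarrival times of a constant-rate Poisson process are i.i.d. exponential, which can be examined with a Kolmogorov-Smirnov statistic. A minimal sketch (the helper name and the simulated data are assumptions, not the paper's test):

```python
import math
import random

def ks_exponential(gaps):
    """Kolmogorov-Smirnov distance between the empirical CDF of the
    interarrival times and an exponential CDF with rate fitted by
    maximum likelihood."""
    n = len(gaps)
    rate = n / sum(gaps)                      # MLE of the arrival rate
    d = 0.0
    for i, x in enumerate(sorted(gaps)):
        f = 1.0 - math.exp(-rate * x)         # fitted exponential CDF
        d = max(d, abs(f - i / n), abs(f - (i + 1) / n))
    return d

random.seed(0)
poisson_gaps = [random.expovariate(2.0) for _ in range(2000)]
uniform_gaps = [random.random() for _ in range(2000)]
print(ks_exponential(poisson_gaps))   # small: consistent with Poisson arrivals
print(ks_exponential(uniform_gaps))   # an order of magnitude larger
```

Because the rate is estimated from the same data, the usual KS critical values are conservative (a Lilliefors-type correction would be needed for a formal p-value); the statistic still cleanly separates exponential from non-exponential gaps.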
Boosting with the L2-Loss: Regression and Classification
, 2001
Abstract

Cited by 140 (15 self)
This paper investigates a variant of boosting, L2Boost, which is constructed from a functional gradient descent algorithm with the L2 loss function. Based on an explicit stagewise refitting expression of L2Boost, the case of (symmetric) linear weak learners is studied in detail in both regression and two-class classification. In particular, with the boosting iteration m working as the smoothing or regularization parameter, a new exponential bias-variance trade-off is found with the variance (complexity) term bounded as m tends to infinity. When the weak learner is a smoothing spline, an optimal rate of convergence result holds for both regression and two-class classification, and this boosted smoothing spline adapts to higher-order, unknown smoothness. Moreover, a simple expansion of the 0-1 loss function is derived to reveal the importance of the decision boundary, bias reduction, and impossibility of an additive bias-variance decomposition in classification. Finally, simulation and real data set results are obtained to demonstrate the attractiveness of L2Boost, particularly with a novel component-wise cubic smoothing spline as an effective and practical weak learner.
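The stagewise refitting that defines L2Boost is simple to state: start from the mean, then repeatedly fit the weak learner to the current residuals and add a shrunken copy of the fit. A minimal sketch in plain Python, using a component-wise simple linear learner in place of the paper's smoothing splines (the function names, shrinkage value, and toy data are illustrative assumptions):

```python
import random

def fit_line(x, r):
    """Least-squares fit r ~ a + b*x on a single feature."""
    n = len(x)
    mx, mr = sum(x) / n, sum(r) / n
    b = (sum((xi - mx) * (ri - mr) for xi, ri in zip(x, r))
         / sum((xi - mx) ** 2 for xi in x))
    return mr - b * mx, b

def l2boost(X, y, steps=200, nu=0.1):
    """L2Boost: functional gradient descent under squared-error loss.
    Each step fits the weak learner to the residuals y - F_m and adds a
    shrunken (factor nu) copy of it to the ensemble."""
    n, p = len(y), len(X[0])
    f0 = sum(y) / n                       # F_0: the sample mean
    pred, terms = [f0] * n, []
    for _ in range(steps):
        resid = [yi - fi for yi, fi in zip(y, pred)]
        # component-wise weak learner: the single feature whose simple
        # linear fit best reduces the residual sum of squares
        best = None
        for j in range(p):
            col = [row[j] for row in X]
            a, b = fit_line(col, resid)
            sse = sum((ri - a - b * xi) ** 2 for ri, xi in zip(resid, col))
            if best is None or sse < best[0]:
                best = (sse, j, a, b)
        _, j, a, b = best
        terms.append((j, nu * a, nu * b))
        pred = [fi + nu * (a + b * row[j]) for fi, row in zip(pred, X)]
    return f0, terms

def boost_predict(model, row):
    f0, terms = model
    return f0 + sum(a + b * row[j] for j, a, b in terms)

random.seed(1)
X = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(200)]
y = [2 * r[0] - r[2] + random.gauss(0, 0.1) for r in X]
model = l2boost(X, y)
print(boost_predict(model, [0.5, 0.0, -0.5]))  # close to 2*0.5 - (-0.5) = 1.5
```

The iteration count `steps` plays the role of the regularization parameter m discussed in the abstract: stopping early gives a smoother (higher-bias, lower-variance) fit.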
Internet Tomography
 IEEE Signal Processing Magazine
, 2002
Abstract

Cited by 120 (13 self)
Today's Internet is a massive, distributed network which continues to explode in size as e-commerce and related activities grow. The heterogeneous and largely unregulated structure of the Internet renders tasks such as dynamic routing, optimized service provision, service-level verification, and detection of anomalous or malicious behavior increasingly challenging. The problem is compounded by the fact that one cannot rely on the cooperation of individual servers and routers to aid in the collection of network traffic measurements vital for these tasks. In many ways, network monitoring and inference problems bear a strong resemblance to other "inverse problems" in which key aspects of a system are not directly observable. Familiar signal processing problems such as tomographic image reconstruction, system identification, and array processing all have interesting interpretations in the networking context. This article introduces the new field of network tomography, a field which we believe will benefit greatly from the wealth of signal processing theory and algorithms.
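The "inverse problem" framing can be made concrete with a toy example: if x is a vector of unobservable per-link delays and A a known routing matrix (A[i][j] = 1 when path i traverses link j), then measured end-to-end path delays satisfy y = A x, and with enough linearly independent paths x can be recovered. A minimal sketch under that idealized model (real network tomography must cope with noise and underdetermined A, via likelihood-based and EM-type methods):

```python
def solve(A, y):
    """Gauss-Jordan elimination with partial pivoting for a small
    dense linear system A x = y."""
    n = len(A)
    M = [row[:] + [yi] for row, yi in zip(A, y)]   # augmented matrix
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

# Routing matrix: entry [i][j] = 1 if path i traverses link j.
A = [[1, 1, 0],   # path 1: links a, b
     [0, 1, 1],   # path 2: links b, c
     [1, 0, 1]]   # path 3: links a, c
true_delays = [2.0, 5.0, 1.0]          # per-link delays (unobservable)
path_delays = [sum(a * x for a, x in zip(row, true_delays)) for row in A]
print(solve(A, path_delays))  # prints [2.0, 5.0, 1.0]
```

The interesting regime, and the reason the field borrows from tomographic reconstruction, is when A has more unknowns than independent measurements and a plain solve like this is impossible.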
Normalization and analysis of DNA microarray data by self-consistency and local regression
Abstract

Cited by 63 (0 self)
With the advent of DNA hybridization microarrays comes the remarkable ability, in principle, to simultaneously monitor the expression levels of large numbers of genes. The quantitative comparison of 2 or more microarrays can reveal, for example, the distinct patterns of gene expression that define different cellular phenotypes or the genes induced in the cellular response to insult or changing environmental conditions. Normalization of the measured intensities is a prerequisite of such comparisons, and indeed of any statistical analysis, yet little attention has been paid to its systematic study. The most straightforward normalization techniques in use rest on the implicit assumption of linear response between true expression level and output intensity. We find that these assumptions are not generally met and that these simple methods can be improved. We have developed a robust semiparametric normalization technique based upon the assumption that the large majority of genes will not have...
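The non-linear response the authors describe is why a single scaling factor is not enough: the bias in the log-ratio between two arrays varies with signal intensity, so it must be estimated and removed locally. As a crude stand-in for the paper's semiparametric local regression, the sketch below uses a running median over probes ordered by intensity (the function, window size, and simulated bias are illustrative assumptions):

```python
import random

def normalize_log_ratios(intensity, log_ratio, window=25):
    """Subtract an intensity-dependent bias curve estimated by a running
    median of the log-ratio over probes ordered by intensity (a crude
    stand-in for local regression / loess)."""
    order = sorted(range(len(intensity)), key=lambda i: intensity[i])
    corrected = [0.0] * len(intensity)
    for rank, i in enumerate(order):
        lo, hi = max(0, rank - window), min(len(order), rank + window + 1)
        neighbors = sorted(log_ratio[order[k]] for k in range(lo, hi))
        corrected[i] = log_ratio[i] - neighbors[len(neighbors) // 2]
    return corrected

random.seed(2)
intensity = [random.uniform(0, 10) for _ in range(500)]
# simulated dye bias that grows with intensity, plus measurement noise
log_ratio = [0.3 * a + random.gauss(0, 0.1) for a in intensity]
corrected = normalize_log_ratios(intensity, log_ratio)
print(sum(log_ratio) / 500)   # biased well away from 0
print(sum(corrected) / 500)   # near 0 after normalization
```

The median-based curve relies on the same working assumption the abstract states: the large majority of genes are not differentially expressed, so the typical log-ratio at each intensity reflects bias, not biology.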
Correlated label propagation with application to multi-label learning
 In: CVPR ’06: Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition
, 2006
Abstract

Cited by 30 (0 self)
Many computer vision applications, such as scene analysis and medical image interpretation, are ill-suited for traditional classification where each image can only be associated with a single class. This has stimulated recent work in multi-label learning where a given image can be tagged with multiple class labels. A serious problem with existing approaches is that they are unable to exploit correlations between class labels. This paper presents a novel framework for multi-label learning termed Correlated Label Propagation (CLP) that explicitly models interactions between labels in an efficient manner. As in standard label propagation, labels attached to training data points are propagated to test data points; however, unlike standard algorithms that treat each label independently, CLP simultaneously co-propagates multiple labels. Existing work eschews such an approach since naive algorithms for label co-propagation are intractable. We present an algorithm based on properties of submodular functions that efficiently finds an optimal solution. Our experiments demonstrate that CLP leads to significant gains in precision/recall against standard techniques on two real-world computer vision tasks involving several hundred labels.
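For contrast with CLP, the standard label propagation it improves on treats every label independently: a test point simply accumulates, per label, distance-weighted votes from its nearest labeled neighbors. A minimal sketch of that independent baseline (the point data and label names are made up; CLP's submodular co-propagation is not reproduced here):

```python
import math

def propagate_labels(train_pts, train_labels, test_pt, k=3):
    """Independent label propagation: score every label by
    inverse-distance-weighted votes from the k nearest training points.
    Each label is handled separately -- the independence CLP removes."""
    nearest = sorted(
        ((math.dist(test_pt, p), labels)
         for p, labels in zip(train_pts, train_labels)),
        key=lambda t: t[0])[:k]
    scores = {}
    for d, labels in nearest:
        for lab in labels:
            scores[lab] = scores.get(lab, 0.0) + 1.0 / (d + 1e-9)
    return scores

train_pts = [(0, 0), (0, 1), (1, 0), (5, 5), (5, 6), (6, 5)]
train_labels = [["sky"], ["sky", "cloud"], ["sky"],
                ["grass"], ["grass", "tree"], ["grass"]]
scores = propagate_labels(train_pts, train_labels, (0.2, 0.2))
print(max(scores, key=scores.get))  # prints sky
```

Note that "cloud" and "sky" are scored without any knowledge that they co-occur; exploiting exactly that correlation is the paper's contribution.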
Locally Weighted Naive Bayes
 Proceedings of the Conference on Uncertainty in Artificial Intelligence
, 2003
Abstract

Cited by 27 (1 self)
Despite its simplicity, the naive Bayes classifier has surprised machine learning researchers by exhibiting good performance on a variety of learning problems. Encouraged by these results, researchers have looked to overcome naive Bayes' primary weakness, attribute independence, and improve the performance of the algorithm. This paper presents a locally weighted version of naive Bayes that relaxes the independence assumption by learning local models at prediction time. Experimental results show that locally weighted naive Bayes rarely degrades accuracy compared to standard naive Bayes and, in many cases, improves accuracy dramatically. The main advantage of this method compared to other techniques for enhancing naive Bayes is its conceptual and computational simplicity.
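The idea of "learning local models at prediction time" can be sketched directly: weight each training instance by a kernel of its distance to the query, then fit naive Bayes using those weights. The sketch below uses a Gaussian kernel and weighted Gaussian class-conditionals; the paper works in the locally weighted learning framework with its own kernel and neighborhood choices, so treat the details here as assumptions:

```python
import math
import random

def lwnb_predict(X, y, query, bandwidth=0.3):
    """Locally weighted naive Bayes: weight training instances by a
    Gaussian kernel of distance to the query, then classify with a
    weighted Gaussian naive Bayes fitted on the fly."""
    w = [math.exp(-sum((a - b) ** 2 for a, b in zip(row, query))
                  / (2 * bandwidth ** 2)) for row in X]
    best, best_lp = None, None
    for c in sorted(set(y)):
        wc = [(wi, row) for wi, row, yi in zip(w, X, y) if yi == c]
        sw = sum(wi for wi, _ in wc)
        lp = math.log(sw / sum(w))            # weighted class prior
        for j in range(len(query)):
            mu = sum(wi * row[j] for wi, row in wc) / sw
            var = max(sum(wi * (row[j] - mu) ** 2 for wi, row in wc) / sw,
                      1e-6)                   # variance floor
            lp += (-0.5 * math.log(2 * math.pi * var)
                   - (query[j] - mu) ** 2 / (2 * var))
        if best is None or lp > best_lp:
            best, best_lp = c, lp
    return best

# XOR-style layout: a single global naive Bayes cannot separate these
# classes, but models fitted locally around each query can
random.seed(3)
X, y = [], []
for cx, cy, lab in [(0, 0, "a"), (1, 1, "a"), (0, 1, "b"), (1, 0, "b")]:
    for _ in range(20):
        X.append((cx + random.gauss(0, 0.1), cy + random.gauss(0, 0.1)))
        y.append(lab)
print(lwnb_predict(X, y, (0.05, 0.05)))  # prints a
print(lwnb_predict(X, y, (0.95, 0.05)))  # prints b
```

Fitting at prediction time costs a pass over the training data per query, which is the computational price paid for relaxing the global independence assumption.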
A spatially adaptive nonparametric regression image deblurring
 IEEE TRANS. IMAGE PROCESS
, 2005