Results 1-10 of 32
ATM Network Design And Optimization: A Multirate Loss Network Framework
IEEE/ACM Transactions on Networking, 1996
Cited by 62 (6 self)
ATM network design and optimization at the call level may be formulated in the framework of multirate, circuit-switched, loss networks with effective bandwidth encapsulating cell-level behavior. Each service supported on the ATM network is characterized by a rate or bandwidth requirement. Future networks will be characterized by links with very large capacities in circuits and by many rates. Various asymptotic results are given to reduce the attendant complexity of numerical calculations. A central element is a uniform asymptotic approximation (UAA) for link analyses. Moreover, a unified hybrid approach is given which allows asymptotic and nonasymptotic methods of calculation to be used cooperatively. Network loss probabilities are obtained by solving fixed-point equations. A canonical problem of route and logical network design is considered. An optimization procedure is proposed, which is guided by gradients obtained by solving a system of equations for implied costs. A novel applic...
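The link analyses that the paper's uniform asymptotic approximation targets can, for moderate capacities, be computed exactly by the classical Kaufman-Roberts recursion for multirate loss links. A minimal sketch with invented parameters (not the paper's method or data):

```python
import math

def kaufman_roberts(capacity, loads, rates):
    """Exact per-class blocking on a multirate loss link.
    capacity: link capacity in circuits; loads[k]: offered load (Erlangs)
    of class k; rates[k]: circuits required per class-k call."""
    q = [0.0] * (capacity + 1)
    q[0] = 1.0
    for n in range(1, capacity + 1):
        q[n] = sum(rho * b * q[n - b]
                   for rho, b in zip(loads, rates) if n - b >= 0) / n
    total = sum(q)
    q = [x / total for x in q]
    # A class-k call is blocked when fewer than rates[k] circuits are free.
    return [sum(q[capacity - b + 1:]) for b in rates]

def erlang_b(capacity, rho):
    """Single-rate special case, for cross-checking."""
    denom = sum(rho ** n / math.factorial(n) for n in range(capacity + 1))
    return (rho ** capacity / math.factorial(capacity)) / denom
```

In the single-rate case the recursion reduces to the Erlang B formula, which gives a quick sanity check; the paper's asymptotics matter when `capacity` is too large for this exact computation.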
Information-theoretic image formation
IEEE Transactions on Information Theory, 1998
Cited by 28 (5 self)
The emergent role of information theory in image formation is surveyed. Unlike the subject of information-theoretic communication theory, information-theoretic imaging is far from a mature subject. The possible role of information theory in problems of image formation is to provide a rigorous framework for defining the imaging problem, for defining measures of optimality used to form estimates of images, for addressing issues associated with the development of algorithms based on these optimality criteria, and for quantifying the quality of the approximations. The definition of the imaging problem consists of an appropriate model for the data and an appropriate model for the reproduction space, which is the space within which image estimates take values. Each problem statement has an associated optimality criterion that measures the overall quality of an estimate. The optimality criteria include maximizing the likelihood function and minimizing mean squared error for stochastic problems, and minimizing squared error and discrimination for deterministic problems. The development of algorithms is closely tied to the definition of the imaging problem and the associated optimality criterion. Algorithms with a strong information-theoretic motivation are obtained by the method of expectation maximization. Related alternating minimization algorithms are discussed. In quantifying the quality of approximations, global and local measures are discussed. Global measures include the (mean) squared error and discrimination between an estimate and the truth, and probability of error for recognition or hypothesis testing problems. Local measures include Fisher information. Index Terms: Image analysis, image formation, image processing, image reconstruction, image restoration, imaging, inverse problems, maximum-likelihood estimation, pattern recognition.
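One concrete instance of the EM-derived imaging algorithms the survey discusses is the Richardson-Lucy update for Poisson deconvolution. A minimal sketch, with an invented blur matrix `H` and noiseless data for illustration:

```python
import numpy as np

def em_poisson_deconv(H, y, iterations=500):
    """Richardson-Lucy / EM iteration for y ~ Poisson(H x), x >= 0.
    The multiplicative update preserves nonnegativity and increases
    the Poisson likelihood at every step."""
    x = np.ones(H.shape[1])
    col_sums = H.sum(axis=0)                # EM normalizer
    for _ in range(iterations):
        ratio = y / np.maximum(H @ x, 1e-12)
        x *= (H.T @ ratio) / col_sums
    return x

# Invented blur matrix and intensities; noiseless data for illustration.
H = np.array([[0.8, 0.2, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.2, 0.8]])
x_true = np.array([4.0, 1.0, 3.0])
x_hat = em_poisson_deconv(H, H @ x_true)
```

With exact data and an invertible blur matrix the iterates converge to the true intensities; with noisy data early stopping acts as regularization.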
A Hybrid High-Order Markov Chain Model for Computer Intrusion Detection
1999
Cited by 20 (3 self)
A hybrid model based mostly on a high-order Markov chain and occasionally on an independence model is proposed for profiling the command sequence of a computer user in order to identify a "signature behavior" for that user. Based on the model, an estimation procedure for such a signature behavior driven by maximum likelihood (ML) considerations is devised. The formal ML estimates are numerically intractable, but the ML optimization problem can be substituted by a linear inverse problem with positivity constraints (LININPOS), for which the EM algorithm can be used as an equation solver to produce an approximate ML estimate. A user's command sequence is then compared to his and others' estimated signature behavior in real time, by means of statistical hypothesis testing. A form of the likelihood-ratio test is used to test whether a given sequence of commands is from the proclaimed user, with the alternative hypothesis being a masquerading user. Data from a real-life experiment, conducted at a research lab, is used to assess the method. Key Words: Anomaly Detection; Unix; Mixture Transition Distribution (MTD); LININPOS; EM.
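A heavily simplified sketch of the hypothesis test described above, using a plain first-order Markov chain in place of the paper's high-order hybrid model, with invented command data:

```python
import math
from collections import defaultdict

def transition_probs(commands, alpha=0.5):
    """First-order transition probabilities with additive smoothing
    (a simplification of the paper's high-order hybrid model)."""
    vocab = sorted(set(commands))
    counts = defaultdict(lambda: defaultdict(float))
    for a, b in zip(commands, commands[1:]):
        counts[a][b] += 1.0
    return {a: {b: (counts[a][b] + alpha) /
                   (sum(counts[a].values()) + alpha * len(vocab))
                for b in vocab} for a in vocab}

def log_likelihood(commands, probs, floor=1e-6):
    """Log-likelihood of a session under a profile; unseen commands
    or transitions fall back to a small floor probability."""
    return sum(math.log(probs.get(a, {}).get(b, floor))
               for a, b in zip(commands, commands[1:]))

# Invented training data for the proclaimed user and a masquerader.
user = ["ls", "cd", "ls", "vi"] * 40
masq = ["gcc", "gdb", "gcc", "make"] * 40
session = ["ls", "cd", "ls", "vi", "ls", "cd"]
lr = (log_likelihood(session, transition_probs(user))
      - log_likelihood(session, transition_probs(masq)))
```

A positive log-likelihood ratio `lr` favors the proclaimed user over the masquerader hypothesis; the paper's actual profiles are high-order MTD models estimated via LININPOS/EM, not this toy chain.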
An EM approach to OD matrix estimation
1994
Cited by 18 (0 self)
Consider a "black box" having I input channels and J output channels. Each arrival on an input channel gets routed through the black box and appears on an output channel. The system is monitored for a fixed time period and a record is made of the number of arrivals on each input channel and the number of departures on each output channel. The OD (origination-destination) matrix estimation problem is to estimate, for each i and j, the number of arrivals on channel i that depart on channel j. We introduce a Poisson stochastic model and employ the EM algorithm to produce high-likelihood estimates. In the case of estimation based on observations over a single time period, we analyze in detail the fixed points of the EM algorithm, showing that every vertex of a certain polytope of feasible matrices is a fixed point and identifying a specific interior fixed point which is a saddle point for the likelihood function.
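The margin-matching flavor of the problem can be illustrated with iterative proportional fitting (Sinkhorn scaling), which repeatedly rescales a seed matrix to the observed row and column totals. This is a stand-in sketch with invented totals, not the paper's Poisson EM:

```python
import numpy as np

def ipf(seed, row_sums, col_sums, iterations=500):
    """Iterative proportional fitting: rescale a positive seed OD matrix
    until its margins match the observed arrival (row) and departure
    (column) totals."""
    X = np.asarray(seed, dtype=float).copy()
    r = np.asarray(row_sums, dtype=float)
    c = np.asarray(col_sums, dtype=float)
    for _ in range(iterations):
        X *= (r / X.sum(axis=1))[:, None]   # match row margins
        X *= (c / X.sum(axis=0))[None, :]   # match column margins
    return X

# Invented totals; both margins must share the same grand total (60).
X = ipf(np.ones((3, 3)), row_sums=[10, 20, 30], col_sums=[15, 15, 30])
```

Like the EM fixed points the paper analyzes, the limit depends on the seed: many feasible matrices share the same margins, which is exactly the identifiability issue the abstract describes.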
The Boolean Solution to the Congested IP Link Location Problem: Theory and Practice
In Proc. IEEE INFOCOM, 2007
Cited by 16 (4 self)
Like other problems in network tomography or traffic matrix estimation, the location of congested IP links from end-to-end measurements requires solving a system of equations that relate the measurement outcomes with the variables representing the status of the IP links. In most networks, this system of equations does not have a unique solution. To overcome this critical problem, current methods use the unrealistic assumption that all IP links have the same prior probability of being congested. We find that this assumption is not needed, because these probabilities can be uniquely identified from a small set of measurements by using properties of Boolean algebra. We can then use the learnt probabilities as priors to rapidly find the congested links at any time, with an order of magnitude gain in accuracy over existing algorithms. We validate our results both by simulation and real implementation in the PlanetLab network.
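The Boolean identity at the heart of such methods is that a path is uncongested only if every link on it is, so log P(path good) is linear in log(1 - p_l) and the link priors become identifiable from path measurements. A toy sketch with an invented routing matrix, not the paper's algorithm:

```python
import numpy as np

# Invented routing matrix: rows are end-to-end paths, columns are links.
A = np.array([[1, 1, 0],
              [0, 1, 1],
              [1, 0, 1]], dtype=float)
p_true = np.array([0.10, 0.30, 0.05])    # per-link congestion priors

# A path is good iff all its links are good (Boolean AND), so
# P(path good) = prod over its links of (1 - p_l), i.e.
# log P(path good) = A @ log(1 - p).
path_good = np.exp(A @ np.log(1.0 - p_true))

# With enough linearly independent paths the priors are identifiable:
x, *_ = np.linalg.lstsq(A, np.log(path_good), rcond=None)
p_hat = 1.0 - np.exp(x)
```

In practice `path_good` would be an empirical frequency from repeated measurements rather than an exact probability, so the recovery is approximate.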
A Full Bayesian Approach for Inverse Problems
In Maximum Entropy and Bayesian Methods, 1996
Cited by 15 (7 self)
The main object of this paper is to present some general concepts of Bayesian inference and more specifically the estimation of the hyperparameters in inverse problems. We consider a general linear situation where we are given some data y related to the unknown parameters a by y = Aa + n, and where we can assign the probability laws p(a|θ), p(y|a, β), p(θ) and p(β). The main discussion is then how to infer a, θ and β, either individually or in any combination. Different situations are considered and discussed. As an important example, we consider the case where θ and β are the precision parameters of the Gaussian laws, to which we assign Gamma priors, and we propose some new and practical algorithms to estimate them simultaneously. Comparisons and links with other classical methods such as maximum likelihood are presented.
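One way such joint estimation is commonly organized is to alternate a MAP update for a with Gamma-posterior mode updates for the two precisions. The sketch below follows that generic recipe with invented hyperprior values; it is not the paper's exact algorithm:

```python
import numpy as np

def alternating_map(A, y, a0=1.0, b0=1e-3, iterations=50):
    """Alternate a MAP update for a with Gamma-posterior mode updates
    for the noise precision theta and the prior precision beta.
    Illustrative sketch; hyperprior values a0, b0 are invented."""
    M, N = A.shape
    theta, beta = 1.0, 1.0
    for _ in range(iterations):
        # MAP estimate of a given the current precisions (ridge solve).
        a = np.linalg.solve(theta * A.T @ A + beta * np.eye(N),
                            theta * A.T @ y)
        # Gamma-posterior modes for the two precisions.
        theta = (a0 + M / 2 - 1) / (b0 + 0.5 * np.sum((y - A @ a) ** 2))
        beta = (a0 + N / 2 - 1) / (b0 + 0.5 * np.sum(a ** 2))
    return a, theta, beta

# Invented test problem: 50 noisy observations of 3 parameters.
rng = np.random.default_rng(0)
A = rng.normal(size=(50, 3))
a_true = np.array([1.0, -2.0, 0.5])
y = A @ a_true + 0.1 * rng.normal(size=50)
a_hat, theta, beta = alternating_map(A, y)
```

The ratio beta/theta plays the role of a regularization parameter that is estimated from the data rather than hand-tuned, which is the practical point of estimating the hyperparameters.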
Conceptual clustering of heterogeneous distributed databases
In Workshop on Ubiquitous Data Mining, PAKDD01, 2001
Cited by 11 (2 self)
With increasingly more databases becoming available on the Internet, there is a growing opportunity to globalise knowledge discovery and learn general patterns, rather than restricting learning to specific databases from which the rules may not be generalisable. Clustering of distributed databases facilitates learning of new concepts that characterise common features of, and differences between, datasets. We are here concerned with clustering databases that hold aggregate count data on a set of attributes that have been classified according to heterogeneous classification schemes. Such aggregates are commonly used for summarising very large databases such as those encountered in data warehousing, large-scale transaction management, and statistical databases. For measuring difference between aggregates we utilise two distance metrics: the Euclidean distance and the Kullback-Leibler information divergence. A hybrid between Kullback-Leibler and the Euclidean distance, which uses the former to learn the class probabilities and the latter as the corresponding distance measure, looks particularly promising both in terms of accuracy and scalability. These metrics are evaluated using synthetic data. Important applications of the work include the clustering of heterogeneous customer databases for the discovery of new marketing concepts and the clustering of medical databases for the discovery of new epidemiological concepts.
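The two distance metrics are straightforward to compute once the aggregates are normalized; the smoothing constant and toy counts below are implementation choices, not from the paper:

```python
import math

def normalize(counts):
    """Turn aggregate counts into a probability distribution."""
    total = float(sum(counts))
    return [c / total for c in counts]

def euclidean(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between normalized count
    distributions; eps guards empty cells (an implementation choice)."""
    return sum(a * math.log((a + eps) / (b + eps)) for a, b in zip(p, q))

# Invented aggregate counts over a shared three-class scheme.
db1 = normalize([40, 30, 30])
db2 = normalize([35, 35, 30])
```

Note that KL divergence is asymmetric and unbounded on empty cells, which is one reason a Euclidean/KL hybrid can be attractive in practice.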
Degraded Character Image Restoration
In Proceedings of the Fifth Annual Symposium on Document Analysis and Image Retrieval, 1996
Cited by 9 (2 self)
The design and analysis of an algorithm for the restoration of degraded images of machine-printed characters is presented. The input is a set of degraded bilevel images of a single unknown character; the output is an approximation to the character's ideal artwork. The algorithm seeks to minimize the discrepancy between the approximation and the ideal, measured as the worst-case Euclidean distance between their boundaries. We investigate a family of algorithms which superimpose the input images, add up the intensities at each point, and threshold the result. We show that, under degradations due to random spatial sampling error, significant asymptotic improvements can be achieved by suitably preprocessing each input image and postprocessing the final result. Experimental trials on special test shapes and Latin characters are discussed.
1 Introduction
In the last few years, a variety of document-image degradation models have been proposed, and their applications investigated [3]. Models...
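The basic superimpose-sum-threshold scheme (without the pre- and post-processing refinements the paper adds) can be sketched on toy data:

```python
import numpy as np

def restore(images, threshold=0.5):
    """Superimpose registered bilevel images, average the intensities,
    and threshold: a majority vote at each pixel. The paper's
    improvements via pre/post-processing are omitted here."""
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    return (stack.mean(axis=0) >= threshold).astype(np.uint8)

# Three noisy copies of an invented 3x3 glyph; majority vote recovers it.
ideal = np.array([[0, 1, 0],
                  [1, 1, 1],
                  [0, 1, 0]])
noisy = [ideal.copy() for _ in range(3)]
noisy[0][0, 0] = 1          # isolated pixel flips in two of the copies
noisy[1][2, 2] = 1
restored = restore(noisy)
```

Each flipped pixel is outvoted by the two clean copies, so the ideal glyph is recovered exactly in this toy case.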
A Study of Least Squares and Maximum Likelihood for Image Reconstruction in Positron Emission Tomography
1993
Parametric Deconvolution of Positive Spike Trains
Annals of Statistics, 2000
Cited by 7 (2 self)
This paper describes a parametric deconvolution method (PDPS) appropriate for a particular class of signals which we call spike-convolution models. These models arise when a sparse spike train (Dirac deltas, according to our mathematical treatment) is convolved with a fixed point-spread function, and additive noise or measurement error is superimposed. We view deconvolution as an estimation problem, regarding the locations and heights of the underlying spikes, as well as the baseline and the measurement error variance, as unknown parameters. Our estimation scheme consists of two parts: model fitting and model selection. To fit a spike-convolution model of a specific order, we estimate peak locations by trigonometric moments, and heights and the baseline by least squares. The model selection procedure has two stages. Its first stage is so designed that we expect a model of a somewhat larger order than the truth to be selected. In the second stage, the final model is obtained using backwar...
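The least-squares half of the model-fitting step is easy to sketch: given candidate spike locations, the heights and baseline follow from a linear fit. Locations are assumed known here (the trigonometric-moment step is omitted), and the point-spread function and data are invented:

```python
import numpy as np

def fit_heights(signal, psf, locations):
    """Given spike locations, fit spike heights and a constant baseline
    by least squares, as in the model-fitting step described above.
    (Location estimation by trigonometric moments is omitted.)"""
    n = len(signal)
    cols = []
    for loc in locations:
        spike = np.zeros(n)
        spike[loc] = 1.0
        cols.append(np.convolve(spike, psf, mode="same"))
    cols.append(np.ones(n))              # baseline column
    D = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(D, signal, rcond=None)
    return coef[:-1], coef[-1]           # heights, baseline

psf = np.array([0.25, 0.5, 0.25])        # invented point-spread function
truth = np.zeros(20)
truth[5], truth[12] = 2.0, 3.0
signal = np.convolve(truth, psf, mode="same") + 0.5   # baseline 0.5
heights, baseline = fit_heights(signal, psf, [5, 12])
```

Because the design matrix columns are just shifted copies of the PSF plus a constant, the fit is exact on noiseless data and remains well-posed as long as the spikes are separated relative to the PSF width.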