Results 1-10 of 25
A Re-Examination of Text Categorization Methods
, 1999
Abstract

Cited by 716 (21 self)
This paper reports a controlled study with statistical significance tests on five text categorization methods: the Support Vector Machines (SVM), a k-Nearest Neighbor (kNN) classifier, a neural network (NNet) approach, the Linear Least-squares Fit (LLSF) mapping and a Naive Bayes (NB) classifier. We focus on the robustness of these methods in dealing with a skewed category distribution, and their performance as a function of the training-set category frequency. Our results show that SVM, kNN and LLSF significantly outperform NNet and NB when the number of positive training instances per category is small (less than ten), and that all the methods perform comparably when the categories are sufficiently common (over 300 instances).
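Of the five methods compared, kNN is the simplest to make concrete. A minimal sketch assuming raw term counts and cosine similarity (the paper's actual feature weighting and score thresholding are more elaborate; all names here are illustrative):

```python
import math
from collections import Counter

def knn_classify(doc_tokens, train, k=3):
    """Toy k-NN text categorizer: cosine similarity over raw term counts.
    `train` is a list of (tokens, label) pairs; the majority label among
    the k most similar training documents is returned."""
    def cosine(a, b):
        dot = sum(a[t] * b.get(t, 0) for t in a)
        na = math.sqrt(sum(v * v for v in a.values()))
        nb = math.sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    q = Counter(doc_tokens)
    # Rank training documents by similarity to the query document
    scored = sorted(train, key=lambda ex: cosine(q, Counter(ex[0])), reverse=True)
    top = [label for _, label in scored[:k]]
    return Counter(top).most_common(1)[0][0]
```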
An Optimality Proof of the LRU-K Page Replacement Algorithm
 Journal of the ACM
, 1996
Abstract

Cited by 28 (1 self)
This paper analyzes a recently published algorithm for page replacement in hierarchical paged memory systems [OOW93]. The algorithm is called the LRU-K method, and reduces to the well-known LRU (Least Recently Used) method for K=1. Previous work [OOW93, WHMZ94, JS94] has shown its effectiveness for K > 1 by simulation, especially in the most common case of K = 2. The basic idea in LRU-K is to keep track of the times of the last K references to memory pages, and to use this statistical information to rank-order the pages by their expected future behavior. The page replacement decision is based on this ranking: which memory-resident page to replace when a newly accessed page must be read into memory. In the current paper we prove, under the assumptions of the independent reference model, that LRU-K is optimal among all replacement algorithms that can be based on information about the K most recent references to each page. The proof uses the Bayesian formula to relate the space of ...
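The ranking idea the abstract describes can be sketched directly: evict the page whose K-th most recent reference lies furthest in the past (largest backward K-distance). A minimal sketch with a logical clock and illustrative names, not the authors' code:

```python
class LRUK:
    """Sketch of LRU-K eviction: keep the last K reference times per page;
    the victim is the page whose K-th most recent reference is oldest.
    Pages with fewer than K references count as maximally distant."""

    def __init__(self, capacity, k=2):
        self.capacity = capacity
        self.k = k
        self.hist = {}   # page -> reference times, newest first
        self.clock = 0   # logical clock instead of wall time

    def access(self, page):
        self.clock += 1
        if page not in self.hist and len(self.hist) >= self.capacity:
            self.evict()
        times = self.hist.setdefault(page, [])
        times.insert(0, self.clock)
        del times[self.k:]  # retain only the K most recent references

    def evict(self):
        # K-th most recent reference time, or -inf with < K references
        def kth_time(page):
            t = self.hist[page]
            return t[self.k - 1] if len(t) >= self.k else float("-inf")
        victim = min(self.hist, key=kth_time)
        del self.hist[victim]
```

For the common K = 2 case, a page referenced twice recently outranks a page touched only once, which is exactly the scan-resistance LRU lacks.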
A new approach for nonlinear distortion correction in endoscopic images based on least squares estimation
 IEEE Trans. Med. Imag
, 1999
Abstract

Cited by 15 (0 self)
Abstract—Images captured with a typical endoscope show spatial distortion, which necessitates distortion correction for subsequent analysis. In this paper, a new methodology based on least squares estimation is proposed to correct the nonlinear distortion in the endoscopic images. A mathematical model based on polynomial mapping is used to map the images from the distorted image space onto the corrected image space. The model parameters include the polynomial coefficients, distortion center, and corrected center. The proposed method utilizes a line search approach of global convergence for the iterative procedure to obtain the optimum expansion coefficients. A new technique to find the distortion center of the image based on a curvature criterion is presented. A dual-step approach comprising token matching and integrated neighborhood search is also proposed for accurate extraction of the centers of the dots contained in a rectangular grid, used for the model parameter estimation. The model parameters were verified with different grid patterns. The distortion-correction model is applied to several gastrointestinal images and the results are presented. The proposed technique provides high-speed response and forms a key step toward online camera calibration, which is required for accurate quantitative analysis of the images. Index Terms—Camera calibration, distortion correction, endoscopy, expansion polynomial.
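As a rough illustration of the polynomial-mapping idea, one can fit a radial expansion polynomial by ordinary least squares and use it to remap points about an assumed distortion center. The paper's iterative line-search estimation of center and coefficients is more involved; the degree and centering below are assumptions:

```python
import numpy as np

def fit_radial_polynomial(r_dist, r_true, degree=3):
    """Least-squares fit of a polynomial mapping distorted radii to
    corrected radii: r_true ~ sum_k c_k * r_dist**k."""
    A = np.vander(r_dist, degree + 1, increasing=True)  # [1, r, r^2, ...]
    coeffs, *_ = np.linalg.lstsq(A, r_true, rcond=None)
    return coeffs

def correct_points(pts, center, coeffs):
    """Remap distorted image points about `center` using the fitted
    radial polynomial; points on the center are left in place."""
    v = pts - center
    r = np.hypot(v[:, 0], v[:, 1])
    r_new = sum(c * r**k for k, c in enumerate(coeffs))
    scale = np.divide(r_new, r, out=np.ones_like(r), where=r > 0)
    return center + v * scale[:, None]
```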
Endogenous monocyte chemoattractant protein-1 recruits monocytes in the zymosan peritonitis
, 1998
"... the zymosan peritonitis model ..."
Enabling High Data Availability in a DHT
 Proc. of Int. Workshop on Grid and P2P Computing Impacts on Large Scale Heterogeneous Distributed Database Systems (GLOBE)
, 2005
Abstract

Cited by 9 (4 self)
Many decentralized and peer-to-peer applications require some sort of data management. Besides P2P file-sharing, there are already scenarios (e.g. the BRICKS project [3]) that need management of finer-grained objects, including updates, and keeping them highly available in very dynamic communities of peers. In order to achieve the project goals and fulfill the requirements, a decentralized/P2P XML storage on top of a DHT (Distributed Hash Table) overlay has been proposed [6]. Unfortunately, DHTs do not provide any guarantees that data will be highly available all the time. A self-managed approach is proposed where availability is stochastically guaranteed by using a replication protocol. The protocol periodically recreates missing replicas, depending on the availability of peers. We are able to minimize the cost incurred for the requested data availability. The protocol is fully decentralized and adapts itself to changes in the community, maintaining the requested availability. Finally, the approach is evaluated and compared with replication mechanisms embedded in other decentralized storages.
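To make the stochastic availability guarantee concrete: under the simplifying assumption that replicas fail independently and each peer is online with known probability p, the replica count needed for a target availability A follows from 1 - (1 - p)^n >= A. A hedged sketch (illustrative only; the paper's protocol may compute this differently):

```python
import math

def replicas_needed(peer_availability, target_availability):
    """Smallest n with P(at least one replica online) >= target, assuming
    independent peers:  1 - (1 - p)^n >= A  =>  n >= log(1-A)/log(1-p)."""
    p, A = peer_availability, target_availability
    if not (0 < p < 1 and 0 < A < 1):
        raise ValueError("availabilities must lie strictly between 0 and 1")
    return math.ceil(math.log(1 - A) / math.log(1 - p))
```

The replication protocol would then recreate replicas whenever the live count drops below this threshold.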
Adaptive packet sampling for flow volume measurement
 Computer Communication Review
, 2002
Abstract

Cited by 8 (0 self)
Traffic measurement and monitoring are important components of network management and traffic engineering. With high-speed Internet backbone links, efficient and effective packet sampling techniques for traffic measurement and monitoring are not only desirable, but also increasingly becoming a necessity. Since the utility of sampling depends on the accuracy and economy of measurement, it is important to control sampling error. In this paper we propose and analyze an adaptive, stratified random packet sampling technique for flow-level traffic measurement. In particular, we address the theoretical and practical issues involved. Through theoretical studies and experiments, we demonstrate that the proposed sampling technique provides unbiased estimation of flow size with a controllable error bound, in terms of both packet and byte counts for elephant flows, while avoiding excessive oversampling.
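The unbiasedness claim rests on inverse-probability (Horvitz-Thompson) weighting: a packet kept with probability r contributes 1/r of its size to the estimate. A minimal non-adaptive sketch (the paper adapts the rate per stratum; names here are illustrative):

```python
import random

def sample_and_estimate(packet_sizes, rate, seed=0):
    """Keep each packet independently with probability `rate`, then
    weight kept packets by 1/rate for unbiased flow-volume estimates."""
    rng = random.Random(seed)
    sampled = [s for s in packet_sizes if rng.random() < rate]
    est_bytes = sum(sampled) / rate
    est_packets = len(sampled) / rate
    return est_bytes, est_packets
```

For a true flow of 1000 packets of 100 bytes sampled at rate 0.1, individual estimates vary, but their average converges to the true 100,000 bytes, which is the unbiasedness property; adapting the rate per stratum then bounds the variance.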
Geometric Tomography and Local Stereology
Abstract

Cited by 4 (4 self)
A substantial portion of E. Lutwak's dual Brunn-Minkowski theory, originally applicable only to star-shaped sets, is extended to the class of bounded Borel sets. The extension is motivated by an important application to local stereology, a collection of stereological designs based on sections through a fixed reference point that has achieved significant medical results in neuroscience and cancer grading.
Mobile Robot Localization Using Pattern Classification Techniques
, 1993
Abstract

Cited by 1 (0 self)
This thesis describes a methodology for coarse position estimation of a mobile robot within an indoor setting. The approach is divided into two phases: exploration and navigation. In the exploration phase, the mobile robot is allowed to sense the environment for the purpose of mapping and thus "learning" its unknown surroundings. In the navigation phase, the robot senses the surroundings and compares this information with its learned maps for the purpose of locating itself in the workspace. We solve the task of coarse-level mobile robot localization via pattern classification of grid-based maps of important or interesting workspace regions. Using datasets representing 10 different rooms and doorways, we estimate a 94% recognition rate of the rooms and a 98% recognition rate of the doorways. We conclude that coarse position estimation is possible through classification of grid-based maps in indoor...
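As a toy illustration of classifying grid-based maps, one can compare a query occupancy grid against learned per-room templates and return the nearest one. The thesis's actual features and classifier are not specified in this abstract; everything below is illustrative:

```python
import numpy as np

def classify_grid(query, templates):
    """Nearest-template classifier over occupancy grids: `templates`
    maps a room label to a learned grid of the same shape as `query`;
    the label with the smallest L2 distance wins."""
    q = np.asarray(query, dtype=float).ravel()
    return min(
        templates,
        key=lambda label: np.linalg.norm(
            np.asarray(templates[label], dtype=float).ravel() - q
        ),
    )
```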
On the Accuracy and Overhead of
Abstract
Abstract — Traffic measurement and monitoring are an important first step for network management and traffic engineering. With high-speed Internet backbone links, efficient and effective packet sampling is not only desirable, but also increasingly becoming a necessity. Sampled NetFlow [10] is a Cisco router’s traffic measurement functionality with static packet sampling for high-speed links. Since the utility of sampling depends on the accuracy and economy of measurement, it is important to understand sampling error and measurement overhead. In this paper, we first discuss fundamental limitations of the sampling techniques used in Sampled NetFlow. We assess the accuracy of Sampled NetFlow by comparing its output with complete packet traces [8] from an operational router. We also show the overheads involved in Sampled NetFlow. We find that Sampled NetFlow performs correctly without incurring dramatic overhead during our experiments. However, care should be taken in its use, since the overhead is linearly proportional to the number of flows recorded.
Macrophages, Suppress Cytokine Release, and Inhibit Neutrophil Migration in Acute Experimental Inflammation
, 2013