Results 11-20 of 489
Quantitative Analysis of the Leakage of Confidential Data
, 2001
Abstract

Cited by 65 (11 self)
Basic information theory is used to analyse the amount of confidential information which may be leaked by programs written in a very simple imperative language. In particular, a detailed analysis is given of the possible leakage due to equality tests and if statements. The analysis is presented as a set of syntax-directed inference rules and can readily be automated.
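As a concrete illustration of the kind of quantity such an analysis bounds, the information leaked by a single equality test on a uniformly distributed secret can be computed directly. The sketch below is not the paper's inference rules, just the underlying arithmetic: it measures the leakage as the Shannon entropy of the test's observable outcome.

```python
import math

def binary_entropy(p: float) -> float:
    """Shannon entropy (in bits) of a Bernoulli(p) outcome."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

def equality_test_leakage(n: int) -> float:
    # Secret uniform over n values; the outcome of `secret == guess` is True
    # with probability 1/n. Since the outcome is a deterministic function of
    # the secret, the leakage is I(S; O) = H(O) = H_b(1/n) bits.
    return binary_entropy(1.0 / n)
```

For n = 2 the test reveals the whole secret (1 bit), while for a large secret space the leakage tends to 0, matching the intuition that one failed password check against a large space leaks very little.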
On the Effectiveness of Secret Key Extraction from Wireless Signal Strength in Real Environments
, 2009
Abstract

Cited by 65 (2 self)
We evaluate the effectiveness of secret key extraction, for private communication between two wireless devices, from the received signal strength (RSS) variations on the wireless channel between the two devices. We use real-world measurements of RSS in a variety of environments and settings. Our experimental results show that (i) in certain environments, due to lack of variations in the wireless channel, the extracted bits have very low entropy making these bits unsuitable for a secret key, (ii) an adversary can cause predictable key generation in these static environments, and (iii) in dynamic scenarios where the two devices are mobile, and/or where there is a significant movement in the environment, high entropy bits are obtained fairly quickly. Building on the strengths of existing secret key extraction approaches, we develop an environment adaptive secret key generation scheme that uses an adaptive lossy quantizer in conjunction with Cascade-based information reconciliation [7] and privacy amplification [14]. Our measurements show that our scheme, in comparison to the existing ones that we evaluate, performs the best in terms of generating high entropy bits at a high bit rate. The secret key bit streams generated by our scheme also pass the randomness tests of the NIST test suite [21] that we conduct.
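The quantization step such schemes share can be sketched as follows. This is a generic two-threshold lossy quantizer in the spirit of the paper's adaptive one, with the guard-band parameter `alpha` chosen purely for illustration:

```python
from statistics import mean, stdev

def quantize_rss(samples, alpha=0.5):
    """Lossy two-threshold quantizer: samples well above the mean become 1,
    samples well below become 0, and samples in the guard band are dropped.
    Returns the bit string and the indices of the samples that were kept."""
    m, s = mean(samples), stdev(samples)
    upper, lower = m + alpha * s, m - alpha * s
    bits, kept = [], []
    for i, x in enumerate(samples):
        if x > upper:
            bits.append(1)
            kept.append(i)
        elif x < lower:
            bits.append(0)
            kept.append(i)
    return bits, kept
```

Both endpoints quantize their own RSS traces and exchange only the kept indices; residual mismatches are then removed by information reconciliation and the agreed bits hashed down by privacy amplification.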
Enhanced Word Clustering for Hierarchical Text Classification
, 2002
Abstract

Cited by 54 (2 self)
In this paper we propose a new information-theoretic divisive algorithm for word clustering applied to text classification. In previous work, such "distributional clustering" of features has been found to achieve improvements over feature selection in terms of classification accuracy, especially at lower numbers of features [2, 28]. However, the existing clustering techniques are agglomerative in nature and result in (i) suboptimal word clusters and (ii) high computational cost. In order to explicitly capture the optimality of word clusters in an information-theoretic framework, we first derive a global criterion for feature clustering. We then present a fast, divisive algorithm that monotonically decreases this objective function value, thus converging to a local minimum. We show that our algorithm minimizes the "within-cluster Jensen-Shannon divergence" while simultaneously maximizing the "between-cluster Jensen-Shannon divergence". In comparison to the previously proposed agglomerative strategies, our divisive algorithm achieves higher classification accuracy, especially at lower numbers of features. We further show that feature clustering is an effective technique for building smaller class models in hierarchical classification. We present detailed experimental results using Naive Bayes and Support Vector Machines on the 20 Newsgroups data set and a 3-level hierarchy of HTML documents collected from Dmoz Open Directory.
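The objective underlying such clustering is the generalized Jensen-Shannon divergence of a set of word distributions. A minimal computation of that quantity (illustrative, not the authors' clustering code) is:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def js_divergence(dists, weights):
    """Generalized Jensen-Shannon divergence of distributions p_i with
    prior weights w_i:  JS_w({p_i}) = H(sum_i w_i p_i) - sum_i w_i H(p_i)."""
    mixture = [sum(w * p[j] for w, p in zip(weights, dists))
               for j in range(len(dists[0]))]
    return entropy(mixture) - sum(w * entropy(p) for w, p in zip(weights, dists))
```

Identical distributions give a divergence of 0, disjoint ones give the entropy of the weights; a low within-cluster value therefore means the words in a cluster have nearly interchangeable class-conditional distributions.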
Delay-bounded packet scheduling of bursty traffic over wireless channels
 IEEE Transactions on Information Theory
Abstract

Cited by 44 (3 self)
In this paper, we study minimal power transmission of bursty sources over wireless channels with constraints on mean queuing delay. The power minimizing schedulers adapt power and rate of transmission based on the queue and channel state. We show that packet scheduling based on queue state can be used to trade queuing delay with transmission power, even on additive white Gaussian noise (AWGN) channels. Our extensive simulations show that small increases in average delay can lead to substantial savings in transmission power, thereby providing another avenue for mobile devices to save on battery power. We propose a low-complexity scheduler that has near-optimal performance. We also construct a variable-rate quadrature amplitude modulation (QAM)-based transmission scheme to show the benefits of the proposed formulation in a practical communication system. Power optimal schedulers with absolute packet delay constraints are also studied and their performance is evaluated via simulations. Index Terms—Packet scheduling, power control, queuing delay, traffic regulation, wireless channels.
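The delay-power tradeoff described above follows from the convexity of the AWGN power-rate function: required power grows exponentially in rate, so the same bits cost less total energy when spread over more slots. A toy calculation (normalized bandwidth and noise power, purely illustrative) makes this concrete:

```python
def slot_energy(bits_per_slot: float) -> float:
    # AWGN capacity with unit bandwidth and unit noise: R = log2(1 + P),
    # so the power needed to sustain rate R is P = 2**R - 1 (normalized units).
    return 2.0 ** bits_per_slot - 1.0

def total_energy(total_bits: float, n_slots: int) -> float:
    """Energy to send total_bits spread evenly over n_slots."""
    return n_slots * slot_energy(total_bits / n_slots)
```

Sending 8 bits in one slot costs 255 energy units, in two slots 30, and in eight slots only 8: an exponential energy saving for a linear increase in delay, which is the lever the paper's schedulers exploit.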
An Edgebreaker-Based Efficient Compression Scheme for Regular Meshes
, 2000
Abstract

Cited by 43 (12 self)
One of the most natural measures of regularity of a triangular mesh homeomorphic to the two-dimensional sphere is the fraction of its vertices having degree 6. We construct a linear-time connectivity compression scheme built upon Edgebreaker which explicitly takes advantage of regularity and prove rigorously that, for sufficiently large and regular meshes, it produces encodings not longer than 0.811 bits per triangle: 50% below the information-theoretic lower bound for the class of all meshes. Our method uses predictive techniques enabled by the Spirale Reversi decoding algorithm.
Quantified Interference for a While Language
 QAPL 2004 PRELIMINARY VERSION
, 2004
Abstract

Cited by 43 (5 self)
We show how an information theoretic approach can quantify interference in a simple imperative language that includes a looping construct. In this paper we focus on a particular case of this definition of interference: leakage of information from private variables to public ones in While language programs. The major result of the paper is a quantitative analysis for this language that employs a use-definition graph to calculate bounds on the leakage into each variable.
The formal definition of reference priors
 ANN. STATIST
, 2009
Abstract

Cited by 37 (2 self)
Reference analysis produces objective Bayesian inference, in the sense that inferential statements depend only on the assumed model and the available data, and the prior distribution used to make an inference is least informative in a certain information-theoretic sense. Reference priors have been rigorously defined in specific contexts and heuristically defined in general, but a rigorous general definition has been lacking. We produce a rigorous general definition here and then show how an explicit expression for the reference prior can be obtained under very weak regularity conditions. The explicit expression can be used to derive new reference priors both analytically and numerically.
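For orientation, the information-theoretic notion the abstract refers to is usually formalized along the following Bernardo-style lines (summarized here from standard accounts, not quoted from the paper): the reference prior maximizes the expected information the data carry about the parameter,

```latex
% Mutual information between parameter and data under prior \pi:
I\{\pi, p\} \;=\; \int\!\!\int p(x \mid \theta)\,\pi(\theta)\,
    \log \frac{p(\theta \mid x)}{\pi(\theta)} \, dx \, d\theta ,
% and the reference prior maximizes this quantity in the limit of
% k independent replications x^{(k)} of the experiment:
\pi^{*} \;=\; \arg\max_{\pi} \; \lim_{k \to \infty}
    I\bigl\{\pi,\; p_k\bigr\},
\qquad p_k\bigl(x^{(k)} \mid \theta\bigr) = \prod_{i=1}^{k} p(x_i \mid \theta).
```

Making this limit rigorous, for priors that may be improper, is exactly the gap the paper's formal definition addresses.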
Heterogeneous Transfer Learning for Image Clustering via the Social Web
Abstract

Cited by 32 (9 self)
In this paper, we present a new learning scenario, heterogeneous transfer learning, which improves learning performance when the data can be in different feature spaces and where no correspondence between data instances in these spaces is provided. In the past, we have classified Chinese text documents using English training data under the heterogeneous transfer learning framework. In this paper, we present image clustering as an example to illustrate how unsupervised learning can be improved by transferring knowledge from auxiliary heterogeneous data obtained from the social Web. Image clustering is useful for image sense disambiguation in query-based image search, but its quality is often low due to the image-data sparsity problem. We extend PLSA to help transfer the knowledge from social Web data, which have mixed feature representations. Experiments on image-object clustering and scene clustering tasks show that our approach in heterogeneous transfer learning based on the auxiliary data is indeed effective and promising.
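PLSA itself, which the paper extends, factorizes a co-occurrence matrix as P(w|d) = sum_z P(z|d) P(w|z) and fits it by EM. A minimal, illustrative implementation (the base model only, not the authors' extended version) is:

```python
import random

def plsa(counts, n_topics, n_iter=50, seed=0):
    """Minimal PLSA fit by EM on a document-word count matrix.

    Factorizes P(w|d) ~ sum_z P(z|d) * P(w|z); returns (P(z|d), P(w|z)).
    """
    rng = random.Random(seed)
    n_docs, n_words = len(counts), len(counts[0])

    def normalized(v):
        s = sum(v)
        if s == 0.0:
            return [1.0 / len(v)] * len(v)  # guard against numerical collapse
        return [x / s for x in v]

    p_z_d = [normalized([rng.random() + 0.1 for _ in range(n_topics)])
             for _ in range(n_docs)]
    p_w_z = [normalized([rng.random() + 0.1 for _ in range(n_words)])
             for _ in range(n_topics)]

    for _ in range(n_iter):
        new_z_d = [[0.0] * n_topics for _ in range(n_docs)]
        new_w_z = [[0.0] * n_words for _ in range(n_topics)]
        for d in range(n_docs):
            for w in range(n_words):
                c = counts[d][w]
                if c == 0:
                    continue
                # E-step: posterior P(z | d, w) proportional to P(z|d) P(w|z)
                post = [p_z_d[d][z] * p_w_z[z][w] for z in range(n_topics)]
                total = sum(post)
                if total == 0.0:
                    continue
                # M-step accumulation: expected counts per (d, z) and (z, w)
                for z in range(n_topics):
                    r = c * post[z] / total
                    new_z_d[d][z] += r
                    new_w_z[z][w] += r
        p_z_d = [normalized(row) for row in new_z_d]
        p_w_z = [normalized(row) for row in new_w_z]
    return p_z_d, p_w_z
```

The heterogeneous-transfer idea is then to estimate the shared factors from both the target images and the auxiliary social-Web co-occurrence data, so that the auxiliary data shape the latent topics used to cluster the images.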