Results 1–10 of 64
Modulation and Information Hiding in Images
, 1996
"... We use concepts from communication theory to characterize information hiding schemes: the amount of information that can be hidden, its perceptibility, and its robustness to removal can be modeled using the quantities channel capacity, signaltonoise ratio, and jamming margin. We then introduce new ..."
Abstract

Cited by 92 (1 self)
 Add to MetaCart
We use concepts from communication theory to characterize information hiding schemes: the amount of information that can be hidden, its perceptibility, and its robustness to removal can be modeled using the quantities channel capacity, signal-to-noise ratio, and jamming margin. We then introduce new information hiding schemes whose parameters can easily be adjusted to trade off capacity, imperceptibility, and robustness as required in the application. The theory indicates the most aggressive feasible parameter settings. We also introduce a technique called predistortion for increasing resistance to JPEG compression. Analogous tactics are presumably possible whenever a model of anticipated distortion is available.
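The communication-theory analogy in this abstract can be illustrated with the Shannon-Hartley formula; the function name and parameter values below are hypothetical illustrations, not the paper's schemes.

```python
import math

def capacity_bits_per_s(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = B * log2(1 + SNR).

    In the hiding analogy, raising embedding strength raises the effective
    SNR of the hidden signal, buying capacity and robustness at the cost
    of perceptibility.
    """
    return bandwidth_hz * math.log2(1 + snr_linear)

# Doubling the SNR buys less than double the capacity: the trade-off is
# logarithmic, which is why aggressive settings hit diminishing returns.
low = capacity_bits_per_s(1_000, 5)
high = capacity_bits_per_s(1_000, 10)
```

The logarithmic shape is the quantitative reason a hiding scheme cannot be made arbitrarily robust simply by embedding more strongly.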
Evolution of biological complexity
 Proceedings of the National Academy of Sciences
, 2000
"... In order to make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent informationtheoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a seq ..."
Abstract

Cited by 62 (17 self)
 Add to MetaCart
In order to make a case for or against a trend in the evolution of complexity in biological evolution, complexity needs to be both rigorously defined and measurable. A recent information-theoretic (but intuitively evident) definition identifies genomic complexity with the amount of information a sequence stores about its environment. We investigate the evolution of genomic complexity in populations of digital organisms and monitor in detail the evolutionary transitions that increase complexity. We show that because natural selection forces genomes to behave as a natural “Maxwell Demon”, within a fixed environment genomic complexity is forced to increase. Darwinian evolution is a simple yet powerful process that requires only a population of reproducing organisms in which each offspring has the potential for a heritable variation from its parent. This principle governs evolution in the natural world, and has gracefully produced organisms of vast complexity. Still, whether or not complexity increases through evolution has become a contentious issue. Gould [1] for example argues that any recognizable trend can be explained by the “drunkard’s walk” model, where “progress” is due simply to a fixed boundary condition. McShea [2] investigates trends in the evolution of certain types of structural and functional complexity, and finds some evidence of a trend but nothing conclusive. In fact, he concludes that “Something may be increasing. But is it complexity?” Bennett [3], on the
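The information-theoretic definition the abstract refers to, complexity as sequence length minus the summed per-site entropy of an adapted population, can be sketched in a few lines; the function name and the toy binary alphabet are assumptions for illustration only.

```python
import math
from collections import Counter

def genomic_complexity(population):
    """Sketch of C = L - sum_i H_i over an aligned population of sequences.

    Sites fixed by selection have zero entropy and so contribute information
    about the environment; freely drifting sites contribute nothing.
    Entropies are in bits, so a binary alphabet is assumed here.
    """
    length = len(population[0])
    total_entropy = 0.0
    for site in range(length):
        counts = Counter(seq[site] for seq in population)
        n = sum(counts.values())
        total_entropy += -sum(c / n * math.log2(c / n) for c in counts.values())
    return length - total_entropy
```

A fully converged population stores maximal information (complexity equals length), while a population that is uniform noise at every site stores none.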
Covert Channels  Here to stay?
"... We discuss the difficulties of satisfying highassurance system requirements without sacrificing system capabilities. To alleviate this problem, we show how tradeoffs can be made to reduce the threat of coved channels. We also clarify certain concepts in the theory of covert channels. Traditionally ..."
Abstract

Cited by 54 (11 self)
 Add to MetaCart
We discuss the difficulties of satisfying high-assurance system requirements without sacrificing system capabilities. To alleviate this problem, we show how trade-offs can be made to reduce the threat of covert channels. We also clarify certain concepts in the theory of covert channels. Traditionally, a covert channel’s vulnerability was measured by its capacity. We show why a capacity analysis alone is not sufficient to evaluate the vulnerability and introduce a new metric referred to as the “small message criterion”.
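The gap between capacity and small-message vulnerability that the abstract alludes to can be illustrated on a binary symmetric channel; the function names and the majority-vote decoding below are illustrative assumptions, not the paper's metric.

```python
import math

def bsc_capacity(p):
    """Capacity (bits per use) of a binary symmetric channel, crossover p."""
    if p in (0.0, 1.0):
        return 1.0
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

def one_bit_leak_prob(p, uses):
    """Probability that a single secret bit, repeated over `uses` channel
    uses and majority-decoded, crosses the channel correctly (odd `uses`)."""
    return sum(math.comb(uses, k) * (1 - p) ** k * p ** (uses - k)
               for k in range(uses // 2 + 1, uses + 1))

# A channel whose capacity rounds to zero can still leak one bit reliably,
# which is the intuition behind judging channels by small messages too:
noisy_capacity = bsc_capacity(0.45)   # well under 0.01 bits per use
leak = one_bit_leak_prob(0.45, 101)   # comfortably above 0.8
```

A rate-based metric averages over long messages, so it understates the danger of leaking a short secret such as a key bit.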
Computational mechanics: Pattern and prediction, structure and simplicity
 Journal of Statistical Physics
, 1999
"... Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causalstate representation—an Emachine—is the minimal one consistent with ..."
Abstract

Cited by 43 (8 self)
 Add to MetaCart
Computational mechanics, an approach to structural complexity, defines a process’s causal states and gives a procedure for finding them. We show that the causal-state representation—an ε-machine—is the minimal one consistent with
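The construction named here, grouping histories into causal states by their conditional future distributions, can be caricatured for short binary strings; the tolerance-based clustering below is a simplification for illustration, not the actual ε-machine reconstruction algorithm.

```python
from collections import Counter, defaultdict

def causal_state_partition(seq, k=2, tol=0.05):
    """Partition length-k histories so that histories with (approximately)
    the same empirical next-symbol distribution share a state -- the
    defining property of causal states, crudely estimated from data."""
    futures = defaultdict(Counter)
    for i in range(len(seq) - k):
        futures[seq[i:i + k]][seq[i + k]] += 1
    dists = {h: {s: c / sum(cnt.values()) for s, c in cnt.items()}
             for h, cnt in futures.items()}
    states = []  # each state: (representative distribution, member histories)
    for h, d in dists.items():
        for rep, members in states:
            symbols = set(d) | set(rep)
            if all(abs(d.get(s, 0) - rep.get(s, 0)) <= tol for s in symbols):
                members.append(h)
                break
        else:
            states.append((d, [h]))
    return [sorted(members) for _, members in states]
```

For a period-2 process the two histories "01" and "10" predict different futures and land in two distinct states, the minimal representation of that process.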
A Network Pump
, 1995
"... A designer of reliable multilevel secure (MLS) networks must consider covert channels and denial of service attacks in addition to traditional network performance measures such as throughput, fairness, and reliability. In this paper we show how to extend the NRL data Pump to a certain MLS network a ..."
Abstract

Cited by 42 (8 self)
 Add to MetaCart
A designer of reliable multilevel secure (MLS) networks must consider covert channels and denial of service attacks in addition to traditional network performance measures such as throughput, fairness, and reliability. In this paper we show how to extend the NRL data Pump to a certain MLS network architecture in order to balance the requirements of congestion control, fairness, good performance, and reliability against those of minimal threats from covert channels and denial of service attacks. We back up our claims with simulation results.
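The balancing act described here, decoupling the exact timing of acknowledgments from high-side behavior while preserving average throughput, can be sketched as a toy model; this is only an illustration of the randomized-ACK idea, not the NRL Pump implementation.

```python
import random

def pump_ack_delay(recent_high_service_times, rng=random.Random(0)):
    """Toy Pump sketch: the low side receives an ACK after a random delay
    whose mean tracks a moving average of the high side's recent service
    times. Averaging preserves throughput; randomization turns individual
    ACK timings into a noisy covert channel rather than a precise one."""
    mean = sum(recent_high_service_times) / len(recent_high_service_times)
    return rng.expovariate(1.0 / mean)  # exponential delay with that mean

# The mean delay follows high-side load, but no single ACK reveals it exactly.
delays = [pump_ack_delay([0.5, 1.0, 1.5]) for _ in range(5)]
```

The design choice is the trade the abstract names: the residual covert channel has low capacity, yet the low side still sees back-pressure when the high side is slow, so congestion control keeps working.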
Simple Timing Channels
 Proceedings 1994 IEEE Computer Society Symposium on Research in Security and Privacy
, 1994
"... the proof of Corollary 1.1 in the actual published paper. We havechanged the bottom index of the second sum from a0toa1. As of Sept. 21, 1994 another typo has been xed. page 60, column 2, beginning of line 9 should read C T(a;a+d) instead of T (a; a + d). ..."
Abstract

Cited by 31 (9 self)
 Add to MetaCart
the proof of Corollary 1.1 in the actual published paper. We have changed the bottom index of the second sum from a0 to a1. As of Sept. 21, 1994 another typo has been fixed: page 60, column 2, beginning of line 9 should read C T(a; a+d) instead of T(a; a+d).
Predictability, Complexity, and Learning
, 2001
"... We define predictive information Ipred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: Ipred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If t ..."
Abstract

Cited by 30 (2 self)
 Add to MetaCart
We define predictive information Ipred(T) as the mutual information between the past and the future of a time series. Three qualitatively different behaviors are found in the limit of large observation times T: Ipred(T) can remain finite, grow logarithmically, or grow as a fractional power law. If the time series allows us to learn a model with a finite number of parameters, then Ipred(T) grows logarithmically with a coefficient that counts the dimensionality of the model space. In contrast, power-law growth is associated, for example, with the learning of infinite-parameter (or nonparametric) models such as continuous functions with smoothness constraints. There are connections between the predictive information and measures of complexity that have been defined both in learning theory and the analysis of physical systems through statistical mechanics and dynamical systems theory. Furthermore, in the same way that entropy provides the unique measure of available information consistent with some simple and plausible conditions, we argue that the divergent part of Ipred(T) provides the unique measure for the complexity of dynamics underlying a time series. Finally, we discuss how these ideas may be useful in problems in physics, statistics, and biology.
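The single-step analogue of Ipred(T) can be estimated directly with a plug-in mutual-information estimator; the helper below is an illustration with finite-sample bias, not the paper's estimator.

```python
import math
from collections import Counter

def predictive_info_1(seq):
    """Plug-in estimate of I(x_t; x_{t+1}) in bits: the T = 1 analogue
    of predictive information for a symbol sequence."""
    n = len(seq) - 1
    joint = Counter(zip(seq, seq[1:]))   # counts of (past, future) pairs
    px = Counter(seq[:-1])               # marginal over the past symbol
    py = Counter(seq[1:])                # marginal over the future symbol
    return sum(c / n * math.log2((c / n) / (px[x] / n * py[y] / n))
               for (x, y), c in joint.items())

# A deterministic period-2 series: knowing the present pins down the
# future, so the estimate approaches 1 bit.
info = predictive_info_1("01" * 100)
```

For an i.i.d. sequence the true value is zero, and the plug-in estimate converges to it from above as the series lengthens.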
A network version of the pump
 Proc. of the IEEE Symposium on Research in Security and Privacy
, 1995
"... A designer of reliable MLS networks must consider covert channels and denial of service attacks in addition to traditional network performance measures such as throughput, fairness, and reliability. In this paper we show how to extend the NRL data Pump to a certain MLS network architecture in order ..."
Abstract

Cited by 26 (7 self)
 Add to MetaCart
A designer of reliable MLS networks must consider covert channels and denial of service attacks in addition to traditional network performance measures such as throughput, fairness, and reliability. In this paper we show how to extend the NRL data Pump to a certain MLS network architecture in order to balance the requirements of congestion control, fairness, good performance, and reliability against those of minimal threats from covert channels and denial of service attacks. We back up our claims with simulation results.
Reconstructing attractors from scalar time series: a comparison of singular system and redundancy criteria
 Physica D
, 1989
"... A delayvector phase space reconstruction in which the delay time satisfies a minimum redundan ~ criterion is compared with a reconstruction obtained using a singular system approach, Minimum redundancy produces the better reconstruction. The reconstructions are compared using a distortion functiona ..."
Abstract

Cited by 19 (0 self)
 Add to MetaCart
A delay-vector phase space reconstruction in which the delay time satisfies a minimum redundancy criterion is compared with a reconstruction obtained using a singular system approach. Minimum redundancy produces the better reconstruction. The reconstructions are compared using a distortion functional, which measures how well the location of a point in the original phase space can be determined on the basis of its image under the reconstruction process. The superiority of the redundancy analysis over the singular system analysis is found to arise from the former's foundation on the notion of general independence as opposed to the latter's foundation on the notion of linear independence.

1. Introduction

The observation that the dynamics of a system with many degrees of freedom can be investigated using time series of a single scalar observable has broadened the class of experiments in which complex behavior can be interpreted as manifestations of strange attractors. Packard et al. [1] suggested two schemes for reconstructing vector dynamics from scalar time series. Takens [2] suggested the
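The two ingredients compared in the abstract, a delay-vector reconstruction and a redundancy (mutual-information) criterion for choosing the delay, can be sketched in a few lines; the bin count and function names are illustrative choices.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Delay-vector reconstruction: row t is
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau])."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def redundancy(x, tau, bins=16):
    """Histogram estimate of I(x_t; x_{t+tau}) in bits; the minimum-redundancy
    criterion takes the first local minimum over tau as the delay time."""
    hist, _, _ = np.histogram2d(x[:-tau], x[tau:], bins=bins)
    p = hist / hist.sum()
    px = p.sum(axis=1, keepdims=True)
    py = p.sum(axis=0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (px * py)[nz])).sum())
```

For a sampled sine wave, the redundancy dips as the lag approaches a quarter period, which is the classic heuristic delay for such signals.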
Testing For Nonlinearity Using Redundancies: Quantitative and Qualitative Aspects
 Physica D
, 1995
"... A method for testing nonlinearity in time series is described based on informationtheoretic functionals  redundancies, linear and nonlinear forms of which allow either qualitative, or, after incorporating the surrogate data technique, quantitative evaluation of dynamical properties of scrutinized ..."
Abstract

Cited by 19 (7 self)
 Add to MetaCart
A method for testing nonlinearity in time series is described, based on information-theoretic functionals (redundancies), whose linear and nonlinear forms allow either qualitative or, after incorporating the surrogate data technique, quantitative evaluation of dynamical properties of the scrutinized data. An interplay of quantitative and qualitative testing on both the linear and nonlinear levels is analyzed, and the robustness of this combined approach against spurious nonlinearity detection is demonstrated. Evaluation of redundancies and redundancy-based statistics as functions of time lag and embedding dimension can further enhance insight into the dynamics of a system under study.

Keywords: time series, nonlinearity, mutual information, redundancy, surrogate data

1 Introduction

The problem of inferring the dynamics of a system from measured data is a perpetual challenge for time series analysts. Ideas and concepts from nonlinear dynamics and the theory of deterministic chaos have led to a num...
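The surrogate-data step the abstract incorporates, surrogates that preserve a series' linear properties while destroying nonlinear structure, is commonly implemented by phase randomization; the sketch below assumes that standard FT-surrogate variant rather than anything specific to this paper.

```python
import numpy as np

def ft_surrogate(x, rng=None):
    """Phase-randomized (FT) surrogate: keep the amplitude spectrum, hence
    the autocorrelation and other linear structure, and randomize the
    phases, which destroys any nonlinear dependence. A nonlinear statistic
    on the data is then compared against its distribution over surrogates."""
    rng = np.random.default_rng() if rng is None else rng
    spectrum = np.fft.rfft(x)
    phases = rng.uniform(0.0, 2.0 * np.pi, len(spectrum))
    randomized = np.abs(spectrum) * np.exp(1j * phases)
    randomized[0] = spectrum[0]        # keep the DC bin: preserves the mean
    if len(x) % 2 == 0:
        randomized[-1] = spectrum[-1]  # keep the Nyquist bin real
    return np.fft.irfft(randomized, n=len(x))
```

Because the amplitude spectrum is untouched, any statistic sensitive only to linear correlations is, in distribution, the same on data and surrogates, so a significant difference points to nonlinearity.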