Results 1–10 of 98
Mobility increases the capacity of ad hoc wireless networks
 IEEE/ACM Transactions on Networking
, 2002
Abstract

Cited by 836 (5 self)
The capacity of ad hoc wireless networks is constrained by the mutual interference of concurrent transmissions between nodes. We study a model of an ad hoc network where n nodes communicate in random source-destination pairs. These nodes are assumed to be mobile. We examine the per-session throughput for applications with loose delay constraints, such that the topology changes over the timescale of packet delivery. Under this assumption, the per-user throughput can increase dramatically when nodes are mobile rather than fixed. This improvement can be achieved by exploiting node mobility as a type of multiuser diversity.
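The scaling contrast described above can be stated as a quick numerical sketch. The Θ(1/√(n log n)) static baseline is the Gupta–Kumar scaling the paper builds on; the bandwidth W and the constant c below are illustrative placeholders, not values from the paper.

```python
import math

def fixed_throughput(n, W=1.0):
    """Gupta-Kumar-style per-session throughput for a static random
    network: Theta(W / sqrt(n log n)), which vanishes as n grows."""
    return W / math.sqrt(n * math.log(n))

def mobile_throughput(n, W=1.0, c=0.1):
    """Two-hop relay scaling from the abstract: per-session throughput
    stays Theta(1) in n; the constant c is purely illustrative."""
    return c * W
```

For any fixed c, the mobile curve eventually dominates the static one as n grows, which is the paper's headline result.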
Practical network support for IP traceback
, 2000
Abstract

Cited by 530 (12 self)
This paper describes a technique for tracing anonymous packet flooding attacks in the Internet back towards their source. This work is motivated by the increased frequency and sophistication of denial-of-service attacks and by the difficulty in tracing packets with incorrect, or “spoofed”, source addresses. In this paper we describe a general-purpose traceback mechanism based on probabilistic packet marking in the network. Our approach allows a victim to identify the network path(s) traversed by attack traffic without requiring interactive operational support from Internet Service Providers (ISPs). Moreover, this traceback can be performed “post-mortem” – after an attack has completed. We present an implementation of this technology that is incrementally deployable, (mostly) backwards compatible, and can be efficiently implemented using conventional technology.
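A toy sketch of probabilistic packet marking in its simplest, node-sampling flavor (the paper's preferred scheme, edge sampling, is more elaborate); the marking probability and router names below are made up for illustration.

```python
import random
from collections import Counter

def mark_packet(path, p=0.25):
    """Node-sampling sketch: each router on the attacker-to-victim path
    overwrites the packet's single mark field with its own address with
    probability p, so the surviving mark belongs to the last router that
    chose to mark."""
    mark = None
    for router in path:
        if random.random() < p:
            mark = router
    return mark

def reconstruct(samples):
    """Victim-side sketch: routers nearer the victim overwrite later and
    so survive more often; ranking marks by frequency therefore orders
    the path from the victim outward."""
    counts = Counter(s for s in samples if s is not None)
    return [router for router, _ in counts.most_common()]
```

With enough attack packets, frequency ranking recovers the path ordering without any ISP cooperation, which is the core idea of the scheme.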
Designing Efficient And Accurate Parallel Genetic Algorithms
, 1999
Abstract

Cited by 220 (5 self)
Parallel implementations of genetic algorithms (GAs) are common, and, in most cases, they succeed in reducing the time required to find acceptable solutions. However, the effects of the parameters of parallel GAs on the quality of their search and on their efficiency are not well understood. This insufficient knowledge limits our ability to design fast and accurate parallel GAs that reach the desired solutions in the shortest time possible. The goal of this dissertation is to advance the understanding of parallel GAs and to provide rational guidelines for their design. The research reported here considered three major types of parallel GAs: simple master-slave algorithms with one population, more sophisticated algorithms with multiple populations, and a hierarchical combination of the first two types. The investigation formulated simple models that accurately predict the quality of the solutions with different parameter settings. The quality predictors were transformed into population-sizing equations, which in turn were used to estimate the execution time of the algorithms.
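A minimal generation step of the first type mentioned above (master-slave, one population), assuming a OneMax objective, binary tournament selection, and bit-flip mutation; these operators are generic stand-ins, not the dissertation's actual test problems.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def fitness(individual):
    """OneMax: count of 1-bits (a stand-in objective)."""
    return sum(individual)

def master_slave_ga_step(population, n_workers=4, rng=random):
    """One generation of a master-slave GA sketch: the master farms out
    fitness evaluation (the costly step) to workers, then applies
    selection and mutation itself."""
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        scores = list(pool.map(fitness, population))
    def select():
        # binary tournament selection
        i, j = rng.randrange(len(population)), rng.randrange(len(population))
        return population[i] if scores[i] >= scores[j] else population[j]
    def mutate(ind, rate=0.01):
        # independent bit-flip mutation
        return [b ^ (rng.random() < rate) for b in ind]
    return [mutate(select()) for _ in population]
```

Only the evaluation phase is parallel here, which is exactly why master-slave GAs give speedup without changing the search behavior of the sequential algorithm.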
The Gambler's Ruin Problem, Genetic Algorithms, and the Sizing of Populations
, 1997
Abstract

Cited by 210 (88 self)
This paper presents a model for predicting the convergence quality of genetic algorithms. The model incorporates previous knowledge about decision making in genetic algorithms and the initial supply of building blocks in a novel way. The result is an equation that accurately predicts the quality of the solution found by a GA using a given population size. Adjustments for different selection intensities are considered and computational experiments demonstrate the effectiveness of the model.

I. Introduction

The size of the population in a genetic algorithm (GA) is a major factor in determining the quality of convergence. The question of how to choose an adequate population size for a particular domain is difficult and has puzzled GA practitioners for a long time. Hard questions are better approached using a divide-and-conquer strategy, and the population sizing issue is no exception. In this case, we can identify two factors that influence convergence quality: the initial supply of build...
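The gambler's ruin model in the title reduces to the classical absorption probability; a generic version is below, with the paper's interpretation of the variables noted in the docstring (the mapping is a paraphrase, not the paper's exact population-sizing equation).

```python
def gamblers_ruin_success(x0, n, p):
    """Probability that a gambler starting with x0 units reaches n before
    going broke, winning each round with probability p (classical formula).
    In the paper's model, x0 plays the role of the initial supply of correct
    building blocks, n the population size, and p the probability of
    deciding a building-block competition correctly."""
    if p == 0.5:
        return x0 / n
    r = (1 - p) / p
    return (1 - r**x0) / (1 - r**n)
```

Inverting this relation (fixing a target success probability and solving for n) is what turns a quality predictor into a population-sizing rule.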
A Statistical Learning Method for Logic Programs with Distribution Semantics
 In Proceedings of the 12th International Conference on Logic Programming (ICLP’95)
, 1995
Abstract

Cited by 95 (23 self)
When a joint distribution PF is given to a set F of facts in a logic program DB = F ∪ R, where R is a set of rules, we can further extend it to a joint distribution PDB over the set of possible least models of DB. We then define the semantics of DB with the associated distribution PF as PDB, and call it distribution semantics. While the
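The semantics sketched above can be made concrete by brute-force enumeration of possible worlds; this toy interpreter assumes ground atoms and definite rules, a simplification of the paper's setting.

```python
from itertools import product

def query_probability(prob_facts, rules, query):
    """Distribution-semantics sketch: each probabilistic fact in F is
    included independently with its probability, the rules R are applied
    to a fixed point (the least model), and the query's probability is
    the total mass of worlds whose least model contains it.
    `rules` maps a head atom to a list of bodies (lists of atoms)."""
    facts = list(prob_facts)
    total = 0.0
    for choices in product([True, False], repeat=len(facts)):
        world_p = 1.0
        model = set()
        for fact, chosen in zip(facts, choices):
            world_p *= prob_facts[fact] if chosen else 1 - prob_facts[fact]
            if chosen:
                model.add(fact)
        # naive forward chaining to the least model
        changed = True
        while changed:
            changed = False
            for head, bodies in rules.items():
                if head not in model and any(all(a in model for a in body) for body in bodies):
                    model.add(head)
                    changed = True
        if query in model:
            total += world_p
    return total
```

Enumeration is exponential in |F|; the practical systems built on this semantics avoid it, but the small case shows what the extended distribution PDB assigns to a query.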
Theoretical Risks and Tabular Asterisks: Sir Karl, Sir Ronald, and the Slow Progress of Soft Psychology
 J. Consulting and Clinical Psychology
, 1978
Abstract

Cited by 68 (3 self)
Theories in “soft” areas of psychology lack the cumulative character of scientific knowledge. They tend neither to be refuted nor corroborated, but instead merely fade away as people lose interest. Even though intrinsic subject matter difficulties (20 listed) contribute to this, the excessive reliance on significance testing is partly responsible, being a poor way of doing science. Karl Popper’s approach, with modifications, would be prophylactic. Since the null hypothesis is quasi-always false, tables summarizing research in terms of patterns of “significant differences” are little more than complex, causally uninterpretable outcomes of statistical power functions. Multiple paths to estimating numerical point values (“consistency tests”) are better, even if approximate with rough tolerances; and lacking this, ranges, orderings, second-order differences, curve peaks and valleys, and function forms should be used. Such methods are usual in developed sciences that seldom report statistical significance. Consistency tests of a conjectural taxometric model yielded 94% success with zero false negatives.
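The claim that significance tables are "outcomes of statistical power functions" can be illustrated with a small simulation: with any nonzero true effect (and the null quasi-always false), the rejection rate is governed by sample size alone. The two-sided z-test with known unit variance is an assumption made purely for simplicity.

```python
import math, random

def rejection_rate(effect, n, trials=2000, alpha=0.05, rng=None):
    """Monte Carlo sketch: draw n samples from N(effect, 1), run a
    two-sided z-test of the null 'mean = 0', and report how often it
    rejects at the 5% level. With effect != 0, this rate (the power)
    tends to 1 as n grows."""
    rng = rng or random.Random(0)
    z_crit = 1.96  # two-sided 5% critical value
    rejections = 0
    for _ in range(trials):
        xs = [rng.gauss(effect, 1.0) for _ in range(n)]
        z = (sum(xs) / n) * math.sqrt(n)  # known sigma = 1
        if abs(z) > z_crit:
            rejections += 1
    return rejections / trials
```

A trivially small effect of 0.1 standard deviations is almost never "significant" at n = 25 and usually is at n = 1000, which is Meehl's point: the asterisk reflects power, not theory.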
On the Sample Complexity of Learning Bayesian Networks
, 1996
Abstract

Cited by 45 (2 self)
In recent years there has been an increasing interest in learning Bayesian networks from data. One of the most effective methods for learning such networks is based on the minimum description length (MDL) principle. Previous work has shown that this learning procedure is asymptotically successful: with probability one, it will converge to the target distribution, given a sufficient number of samples. However, the rate of this convergence has been hitherto unknown. In this work we examine the sample complexity of MDL-based learning procedures for Bayesian networks. We show that the number of samples needed to learn an ε-close approximation (in terms of entropy distance) with confidence δ is O((1/ε)^(4/3) log(1/ε) log(1/δ) log log(1/δ)). This means that the sample complexity is a low-order polynomial in the error threshold and sublinear in the confidence bound. We also discuss how the constants in this term depend on the complexity of the target distribution. F...
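The bound can be transcribed directly; the leading constant is unspecified in the abstract (and, per the authors, depends on the target distribution), so C = 1 below is purely illustrative.

```python
import math

def mdl_sample_bound(eps, delta, C=1.0):
    """Evaluate the abstract's asymptotic bound
        N = O((1/eps)^(4/3) * log(1/eps) * log(1/delta) * log log(1/delta))
    up to an unspecified constant C. Requires eps < 1 and delta < 1/e
    so that every log factor is positive."""
    inv_e, inv_d = 1.0 / eps, 1.0 / delta
    return C * inv_e ** (4 / 3) * math.log(inv_e) * math.log(inv_d) * math.log(math.log(inv_d))
```

Tightening the confidence by a factor of 10 multiplies the bound by far less than 10, which is what "sublinear in the confidence bound" means here.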
Impact of Synaptic Unreliability on the Information Transmitted by Spiking Neurons
 J. Neurophysiol
, 1998
Abstract

Cited by 40 (1 self)
In this paper, we use simple biophysical models of spike transduction and stochastic synaptic release to explore the implications of synaptic unreliability on information transmission and neural coding in the cortex. Our goal is to provide a quantitative answer to the question: How much information can the output spike train provide about the synaptic inputs? Our answer will be cast in an information-theoretic framework. […] independent unreliable synapses provides the drive to an integrate-and-fire neuron. Within this model, the mutual information between the synaptic drive and the resulting output spike train can be computed exactly from distributions that depend only on a single variable, the interspike interval. The reduction of the calculation to dependence on only a single variable greatly reduces the amount of data required to obtain reliable information estimates. We consider two factors that govern the rate of information transfer: the synaptic reliability and the number of synapses connecting each presynaptic …
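A drastically simplified stand-in for the quantity studied: modeling a single unreliable synapse as a Z-channel (presynaptic spike with probability q, vesicle release with probability p_release, no spontaneous release) gives a closed-form mutual information. This is an illustration of how unreliability caps information transfer, not the paper's interspike-interval calculation.

```python
import math

def h(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def synapse_mutual_info(q, p_release):
    """Mutual information (bits per trial) across one unreliable synapse
    modeled as a Z-channel: I(X;Y) = H(Y) - H(Y|X), where X is the input
    spike (Bernoulli q) and Y the release event (Y=1 only if X=1 and the
    vesicle is released, with probability p_release)."""
    p_out = q * p_release          # P(release)
    return h(p_out) - q * h(p_release)
```

A perfectly reliable synapse (p_release = 1) transmits the full input entropy h(q); any unreliability strictly reduces it.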
Exponential functionals of Lévy processes
 Probability Surveys
, 2005
Abstract

Cited by 34 (4 self)
Abstract: This text surveys properties and applications of the exponential functional ∫_0^t exp(−ξ_s) ds of real-valued Lévy processes ξ = (ξ_t, t ≥ 0).
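The functional being surveyed can be approximated by Euler discretization for the simplest Lévy process with a Gaussian component, Brownian motion with drift; the step count and parameter values below are arbitrary choices for illustration.

```python
import math, random

def exp_functional(mu, sigma, t, n_steps=2000, rng=None):
    """Euler sketch of the exponential functional  ∫_0^t exp(-ξ_s) ds
    for ξ_s = mu*s + sigma*B_s (Brownian motion with drift), using a
    left-point Riemann sum over n_steps increments."""
    rng = rng or random.Random(0)
    dt = t / n_steps
    xi, integral = 0.0, 0.0
    for _ in range(n_steps):
        integral += math.exp(-xi) * dt
        xi += mu * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    return integral
```

Setting sigma = 0 recovers the deterministic check ∫_0^t exp(−μs) ds = (1 − e^(−μt))/μ, a useful sanity test before simulating the random case.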