Results 1–10 of 18
Coordination of Groups of Mobile Autonomous Agents Using Nearest Neighbor Rules
, 2002
Abstract

Cited by 604 (44 self)
In a recent Physical Review Letters paper, Vicsek et al. propose a simple but compelling discrete-time model of n autonomous agents (i.e., points or particles) all moving in the plane with the same speed but with different headings. Each agent's heading is updated using a local rule based on the average of its own heading plus the headings of its "neighbors." In their paper, Vicsek et al. provide simulation results which demonstrate that the nearest neighbor rule they are studying can cause all agents to eventually move in the same direction despite the absence of centralized coordination and despite the fact that each agent's set of nearest neighbors changes with time as the system evolves. This paper provides a theoretical explanation for this observed behavior. In addition, convergence results are derived for several other similarly inspired models.
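The heading-update rule this abstract describes can be sketched in a few lines. The interaction radius, speed, box size, and periodic boundary below are illustrative simulation choices, not parameters taken from the paper:

```python
import numpy as np

def vicsek_step(pos, theta, r=1.0, speed=0.05, box=10.0):
    """One update of the nearest-neighbor heading rule: each agent adopts the
    angle of the summed heading vectors of all agents within radius r
    (including itself), then moves one step at constant speed."""
    # pairwise displacements with periodic boundary (a common simulation choice)
    d = pos[:, None, :] - pos[None, :, :]
    d -= box * np.round(d / box)
    neighbors = (d ** 2).sum(-1) <= r ** 2        # adjacency matrix, self included
    # average heading via vector sums to avoid angle-wraparound problems
    new_theta = np.arctan2(neighbors @ np.sin(theta), neighbors @ np.cos(theta))
    vel = speed * np.stack([np.cos(new_theta), np.sin(new_theta)], axis=1)
    return (pos + vel) % box, new_theta

rng = np.random.default_rng(0)
pos = rng.uniform(0, 10.0, size=(50, 2))
theta = rng.uniform(-np.pi, np.pi, size=50)
for _ in range(500):
    pos, theta = vicsek_step(pos, theta)
# alignment order parameter: length of the mean heading vector (1.0 = consensus)
order = np.hypot(np.cos(theta).mean(), np.sin(theta).mean())
```

With this noise-free update the order parameter tends to grow toward 1 as headings reach consensus, provided the time-varying neighbor graph stays sufficiently connected, which is exactly the condition the paper analyzes.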
Strong Uniform Times and Finite Random Walks
 Advances in Applied Mathematics 8, 69–97 (1987)
, 1987
Abstract

Cited by 59 (7 self)
There are several techniques for obtaining bounds on the rate of convergence to the stationary distribution for Markov chains with strong symmetry properties, in particular random walks on finite groups. An elementary method, strong uniform times, is often effective. We prove such times always exist, and relate this method to coupling and Fourier analysis.
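A standard textbook illustration of a strong uniform time (the top-to-random shuffle, not an example drawn from this paper) can be simulated directly: one step after the card that started at the bottom of the deck first reaches the top, the deck is exactly uniformly distributed, regardless of how long that took:

```python
import random

def top_to_random_stopping_time(n, rng):
    """Simulate the top-to-random shuffle on a deck of n cards and return the
    classical strong uniform time T: one step after the card that began at
    the bottom first surfaces at the top of the deck."""
    deck = list(range(n))
    bottom = deck[-1]                 # track the card that starts at the bottom
    t = 0
    while deck[0] != bottom:          # wait until the tracked card surfaces
        card = deck.pop(0)
        deck.insert(rng.randrange(n), card)   # reinsert at a uniform position
        t += 1
    return t + 1                      # one final shuffle makes the deck uniform

rng = random.Random(1)
times = [top_to_random_stopping_time(6, rng) for _ in range(2000)]
mean_T = sum(times) / len(times)      # empirically close to n * H_n for n = 6
```

Bounding the tail of such a stopping time immediately bounds the distance of the chain from stationarity, which is the method the abstract refers to.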
Approximating center points with iterated Radon points
 Internat. J. Comput. Geom. Appl
, 1996
Abstract

Cited by 55 (10 self)
We give a practical and provably good Monte Carlo algorithm for approximating center points. Let P be a set of n points in ℝ^d. A point c ∈ ℝ^d is a β-center point of P if every closed halfspace containing c contains at least βn points of P. Every point set has a 1/(d+1)-center point; our algorithm finds an Ω(1/d²)-center point with high probability. Our algorithm has a small constant factor and is the first approximate center point algorithm whose complexity is subexponential in d. Moreover, it can be optimally parallelized to require O(log² d log log n) time. Our algorithm has been used in mesh partitioning methods and can be used in the construction of high breakdown estimators for multivariate datasets in statistics. It has the potential to improve results in practice for constructing weak ε-nets. We derive a variant of our algorithm whose time bound is fully polynomial in d and linear in n, and show how to combine our approach with previous techniques to compute high quality center points more quickly.
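The iterated-Radon-points idea can be sketched as follows; the subset grouping, number of rounds, and final averaging are illustrative simplifications, not the paper's exact procedure:

```python
import numpy as np

def radon_point(pts):
    """Radon point of d+2 points in R^d: a point contained in the convex hulls
    of both classes of the Radon partition.  It is computed from a null vector
    of the system  sum_i a_i x_i = 0,  sum_i a_i = 0."""
    A = np.vstack([pts.T, np.ones(len(pts))])   # (d+1) x (d+2), rank <= d+1
    a = np.linalg.svd(A)[2][-1]                 # null-space vector
    pos = a > 0                                 # one class of the partition
    return (a[pos] @ pts[pos]) / a[pos].sum()

def approx_center_point(P, rounds=3, rng=None):
    """Monte Carlo sketch of iterated Radon points: repeatedly replace random
    (d+2)-subsets by their Radon points, then average what remains."""
    rng = rng or np.random.default_rng(0)
    d = P.shape[1]
    pts = P.copy()
    for _ in range(rounds):
        rng.shuffle(pts)                        # random disjoint (d+2)-subsets
        grouped = [radon_point(pts[i:i + d + 2])
                   for i in range(0, len(pts) - (d + 1), d + 2)]
        if len(grouped) < d + 2:
            break
        pts = np.array(grouped)
    return pts.mean(axis=0)

rng = np.random.default_rng(42)
P = rng.normal(size=(200, 2))
c = approx_center_point(P, rounds=3, rng=rng)
```

Each Radon point lies in the convex hulls of both halves of its subset, so every iteration pushes the working set toward higher-depth locations, which is what makes the depth guarantee possible.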
Studying Recommendation Algorithms by Graph Analysis
, 2003
Abstract

Cited by 24 (0 self)
We present a novel framework for studying recommendation algorithms in terms of the ‘jumps’ that they make to connect people to artifacts. This approach emphasizes reachability via an algorithm within the implicit graph structure underlying a recommender dataset and allows us to consider questions relating algorithmic parameters to properties of the datasets. For instance, given a particular algorithm ‘jump,’ what is the average path length from a person to an artifact? Or, what choices of minimum ratings and jumps maintain a connected graph? We illustrate the approach with a common jump called the ‘hammock’ using movie recommender datasets.
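A minimal sketch of the ‘hammock’ jump on toy data (the ratings below are hypothetical): two people are connected when they have rated at least w common artifacts, and the width w controls how sparse the induced person-to-person graph becomes:

```python
from itertools import combinations

# Toy ratings: person -> set of rated movies (hypothetical data).
ratings = {
    "alice": {"m1", "m2", "m3"},
    "bob":   {"m2", "m3", "m4"},
    "carol": {"m4", "m5"},
}

def hammock_edges(ratings, w):
    """Hammock jump of width w: connect two people when they have rated at
    least w common artifacts.  Returns the induced person-person edge set."""
    return {(a, b) for a, b in combinations(sorted(ratings), 2)
            if len(ratings[a] & ratings[b]) >= w}

edges_w1 = hammock_edges(ratings, 1)   # weaker requirement, denser graph
edges_w2 = hammock_edges(ratings, 2)   # stricter hammock, sparser graph
```

Questions like the ones in the abstract (average path length, connectivity) are then properties of this induced graph as w and the minimum-rating threshold vary.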
Sampling-Based Sensor-Network Deployment
Abstract

Cited by 22 (0 self)
In this paper, we consider the problem of placing networked sensors in a way that guarantees coverage and connectivity. We focus on sampling-based deployment and present algorithms that guarantee coverage and connectivity with a small number of sensors. We consider two different scenarios based on the flexibility of deployment. If deployment has to be accomplished in one step, as in airborne deployment, then the main question becomes how many sensors are needed. If deployment can be implemented in multiple steps, then awareness of coverage and connectivity can be updated. For this case, we present incremental deployment algorithms which consider the current placement to adjust the sampling domain. The algorithms are simple, easy to implement, and require a small number of sensors. We believe the concepts and algorithms presented in this paper will provide a unifying framework for existing and future deployment algorithms which consider many practical issues not addressed in the present work.
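The incremental, multiple-step scenario can be sketched as a greedy filter over sampled locations; the sensing and communication radii, the grid of coverage targets, and the accept/reject rule below are assumptions for illustration, not the paper's algorithms:

```python
import math
import random

def incremental_deploy(candidates, r_sense, r_comm, coverage_targets):
    """Greedy incremental sketch: accept a sampled location only if it covers
    some still-uncovered target and (after the first sensor) lies within
    communication range of an already-placed sensor."""
    placed, uncovered = [], set(coverage_targets)
    for p in candidates:
        newly = {t for t in uncovered if math.dist(p, t) <= r_sense}
        connected = not placed or any(math.dist(p, q) <= r_comm for q in placed)
        if newly and connected:
            placed.append(p)
            uncovered -= newly
        if not uncovered:
            break
    return placed, uncovered

rng = random.Random(7)
targets = [(x, y) for x in range(10) for y in range(10)]   # unit grid to cover
samples = [(rng.uniform(0, 9), rng.uniform(0, 9)) for _ in range(2000)]
placed, uncovered = incremental_deploy(samples, r_sense=1.5, r_comm=3.0,
                                       coverage_targets=targets)
```

By construction every accepted sensor after the first is reachable from an earlier one, so the placed set forms a connected communication graph while coverage grows.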
Probabilistic bounds on the coefficients of polynomials with only real zeros
 J. Combin. Theory Ser. A
, 1997
Abstract

Cited by 20 (0 self)
The work of Harper and subsequent authors has shown that finite sequences (a_0, …, a_n) arising from combinatorial problems are often such that the polynomial A(z) := Σ_{k=0}^n a_k z^k has only real zeros. Basic examples include rows from the arrays of binomial coefficients, Stirling numbers of the first and second kinds, and Eulerian numbers. Assuming the a_k are nonnegative, A(1) > 0 and that A(z) is not constant, it is known that A(z) has only real zeros iff the normalized sequence (a_0/A(1), …, a_n/A(1)) is the probability distribution of the number of successes in n independent trials for some sequence of success probabilities. Such sequences (a_0, …, a_n) are also known to be characterized by total positivity of the infinite matrix (a_{j−i}) indexed by nonnegative integers i and j. This paper reviews inequalities and approximations for such sequences, called Pólya frequency sequences, which follow from their probabilistic representation. In combinatorial examples these inequalities yield a number of improvements of known estimates.
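The probabilistic representation stated here is easy to check numerically: expanding A(z) = Π_j (1 − p_j + p_j z), whose zeros z = −(1 − p_j)/p_j are all real, reproduces the distribution of the number of successes in independent trials with success probabilities p_j (the probabilities below are arbitrary illustrative values):

```python
from itertools import product

import numpy as np

# Success probabilities for three independent trials (illustrative values).
p = np.array([0.2, 0.5, 0.7])

# Coefficients of A(z) = prod_j (1 - p_j + p_j z), ascending in z: here a_k is
# already normalized, since A(1) = 1.
coeffs = np.array([1.0])
for pj in p:
    coeffs = np.convolve(coeffs, [1 - pj, pj])

# Direct Poisson-binomial pmf by enumerating all success/failure outcomes.
pmf = np.zeros(len(p) + 1)
for outcome in product([0, 1], repeat=len(p)):
    prob = np.prod([pj if s else 1 - pj for pj, s in zip(p, outcome)])
    pmf[sum(outcome)] += prob
```

The two computations agree coefficient by coefficient, and all zeros of the expanded polynomial are real, illustrating the equivalence the abstract describes.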
Jumping Connections: A Graph-Theoretic Model for Recommender Systems
 Master’s Thesis, Virginia Tech
, 2001
Abstract

Cited by 15 (6 self)
Recommender systems have become paramount to customize information access and reduce information overload. They serve multiple uses, ranging from suggesting products and artifacts (to consumers), to bringing people together by the connections induced by (similar) reactions to products and services. This thesis presents a graph-theoretic model that casts recommendation as a process of ‘jumping connections’ in a graph. In addition to emphasizing the social network aspect, this viewpoint provides a novel evaluation criterion for recommender systems. Algorithms for recommender systems are distinguished not in terms of predicted ratings of services/artifacts, but in terms of the combinations of people and artifacts that they bring together. We present an algorithmic framework drawn from random graph theory and outline an analysis for one particular form of jump called a ‘hammock.’ Experimental results on two datasets collected over the Internet demonstrate the validity of this approach.
Process Physics: From Quantum Foam to General Relativity
Abstract

Cited by 13 (9 self)
Progress in the new information-theoretic process physics is reported in which the link to the phenomenology of general relativity is made. In process physics the fundamental assumption is that reality is to be modelled as self-organising semantic (or internal or relational) information using a self-referentially limited neural network model. Previous progress in process physics included the demonstration that space and quantum physics are emergent and unified, with time a distinct non-geometric process, and that quantum phenomena are caused by fractal topological defects embedded in and forming a growing three-dimensional fractal process-space, which is essentially a quantum foam. Other features of the emergent physics were: quantum field theory with emergent flavour and confined colour, limited causality and the Born quantum measurement meta-rule, inertia, time-dilation effects, gravity and the equivalence principle, a growing universe with a cosmological constant, black holes and event horizons, and the emergence of classicality. Here general relativity and the technical language of general covariance are seen not to be fundamental but a phenomenological construct, arising as an amalgam of two distinct phenomena: the ‘gravitational’ characteristics of the emergent quantum foam, for which ‘matter’ acts as a sink, and the classical ‘spacetime’ measurement protocol, but with the latter violated by quantum measurement processes. Quantum gravity, as manifested in the emergent Quantum Homotopic Field Theory of the process-space or quantum foam, is logically prior to the emergence of the general relativity phenomenology, and cannot be derived from it.
Order statistics for decomposable combinatorial structures
 Random Structures and Algorithms
, 1994
Abstract

Cited by 13 (3 self)
Summary. In this paper we consider the component structure of decomposable combinatorial objects, both labeled and unlabeled, from a probabilistic point of view. In both cases we show that when the generating function for the components of a structure is a logarithmic function, then the joint distribution of the normalized order statistics of the component sizes of a random object of size n converges to the Poisson–Dirichlet distribution on the simplex ∇ = {(x_1, x_2, …) : Σ_i x_i = 1, x_1 ≥ x_2 ≥ … ≥ 0}. This result complements recent results obtained by Flajolet and Soria [9] on the total number of components in a random combinatorial structure.
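For the classic logarithmic case, the cycles of a uniform random permutation, the behavior of the largest normalized component can be observed empirically; the permutation size and trial count below are illustrative, and the limiting mean of the largest normalized cycle is the Golomb–Dickman constant ≈ 0.624:

```python
import random

def cycle_lengths(perm):
    """Cycle type of a permutation given as a list, where perm[i] is the
    image of i; returns cycle sizes sorted largest first."""
    seen, lengths = [False] * len(perm), []
    for i in range(len(perm)):
        if not seen[i]:
            j, size = i, 0
            while not seen[j]:
                seen[j] = True
                j = perm[j]
                size += 1
            lengths.append(size)
    return sorted(lengths, reverse=True)

rng = random.Random(3)
n, trials = 500, 400
largest = []
for _ in range(trials):
    perm = list(range(n))
    rng.shuffle(perm)                            # uniform random permutation
    largest.append(cycle_lengths(perm)[0] / n)   # largest normalized component
mean_largest = sum(largest) / trials
```

The full sorted vector of normalized cycle sizes, not just its first coordinate, is what converges to the Poisson–Dirichlet distribution in the theorem above.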
Characteristics of the Synchronization of Brain Activity Imposed by Finite Conduction Velocities of Axons
, 2000
Abstract

Cited by 8 (4 self)
The electrical activity of neurons in brains fluctuates erratically, both in terms of the pulse trains of single neurons and the dendritic currents of populations of neurons. Obviously the neurons interact with one another in the production of intelligent behavior, so it is reasonable to expect to find evidence for varying degrees of synchronization of their pulse trains and dendritic currents in relation to behavior. However, synaptic communication between neurons depends on propagation of action potentials between neurons, often with appreciable distances between them, and the transmission delays are not compatible with synchronization in any simple way. Evidence is on hand showing that the principal form of synchrony is by establishment of a low degree of covariance among very large numbers of otherwise autonomous neurons, which allows for rapid state transitions of neural populations between successive chaotic basins of attraction along itinerant trajectories. The small fraction of covariant activity is extracted by spatial integration upon axonal transmission over divergent-convergent pathways, through which a remarkable improvement in signal-to-noise ratio is achieved. The raw traces of local activity show little evidence for synchrony, other than zero-lag correlation, which appears to be largely a statistical artifact. Brains rely less on tight phase-locking of small numbers of repetitively firing neurons and more on low degrees of cooperativity achieved by order parameters influencing very large numbers of neurons. Brains appear to be indifferent to and undisturbed by widely varying time and phase relations between individual neurons and even large semi-autonomous areas of cortex comprising their cooperative neural masses.